Jan 22 16:13:25 np0005592767 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 22 16:13:25 np0005592767 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 22 16:13:25 np0005592767 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 16:13:25 np0005592767 kernel: BIOS-provided physical RAM map:
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 22 16:13:25 np0005592767 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 22 16:13:25 np0005592767 kernel: NX (Execute Disable) protection: active
Jan 22 16:13:25 np0005592767 kernel: APIC: Static calls initialized
Jan 22 16:13:25 np0005592767 kernel: SMBIOS 2.8 present.
Jan 22 16:13:25 np0005592767 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 22 16:13:25 np0005592767 kernel: Hypervisor detected: KVM
Jan 22 16:13:25 np0005592767 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 22 16:13:25 np0005592767 kernel: kvm-clock: using sched offset of 3722208619 cycles
Jan 22 16:13:25 np0005592767 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 22 16:13:25 np0005592767 kernel: tsc: Detected 2800.000 MHz processor
Jan 22 16:13:25 np0005592767 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 22 16:13:25 np0005592767 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 22 16:13:25 np0005592767 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 22 16:13:25 np0005592767 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 22 16:13:25 np0005592767 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 22 16:13:25 np0005592767 kernel: Using GB pages for direct mapping
Jan 22 16:13:25 np0005592767 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 22 16:13:25 np0005592767 kernel: ACPI: Early table checksum verification disabled
Jan 22 16:13:25 np0005592767 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 22 16:13:25 np0005592767 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:13:25 np0005592767 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:13:25 np0005592767 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:13:25 np0005592767 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 22 16:13:25 np0005592767 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:13:25 np0005592767 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 22 16:13:25 np0005592767 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 22 16:13:25 np0005592767 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 22 16:13:25 np0005592767 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 22 16:13:25 np0005592767 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 22 16:13:25 np0005592767 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 22 16:13:25 np0005592767 kernel: No NUMA configuration found
Jan 22 16:13:25 np0005592767 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 22 16:13:25 np0005592767 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 22 16:13:25 np0005592767 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 22 16:13:25 np0005592767 kernel: Zone ranges:
Jan 22 16:13:25 np0005592767 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 22 16:13:25 np0005592767 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 22 16:13:25 np0005592767 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 22 16:13:25 np0005592767 kernel:  Device   empty
Jan 22 16:13:25 np0005592767 kernel: Movable zone start for each node
Jan 22 16:13:25 np0005592767 kernel: Early memory node ranges
Jan 22 16:13:25 np0005592767 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 22 16:13:25 np0005592767 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 22 16:13:25 np0005592767 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 22 16:13:25 np0005592767 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 22 16:13:25 np0005592767 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 22 16:13:25 np0005592767 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 22 16:13:25 np0005592767 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 22 16:13:25 np0005592767 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 22 16:13:25 np0005592767 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 22 16:13:25 np0005592767 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 22 16:13:25 np0005592767 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 22 16:13:25 np0005592767 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 22 16:13:25 np0005592767 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 22 16:13:25 np0005592767 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 22 16:13:25 np0005592767 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 22 16:13:25 np0005592767 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 22 16:13:25 np0005592767 kernel: TSC deadline timer available
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Max. logical packages:   8
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Max. logical dies:       8
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Max. dies per package:   1
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Max. threads per core:   1
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Num. cores per package:     1
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Num. threads per package:   1
Jan 22 16:13:25 np0005592767 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 22 16:13:25 np0005592767 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 22 16:13:25 np0005592767 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 22 16:13:25 np0005592767 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 22 16:13:25 np0005592767 kernel: Booting paravirtualized kernel on KVM
Jan 22 16:13:25 np0005592767 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 22 16:13:25 np0005592767 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 22 16:13:25 np0005592767 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 22 16:13:25 np0005592767 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 22 16:13:25 np0005592767 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 16:13:25 np0005592767 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 22 16:13:25 np0005592767 kernel: random: crng init done
Jan 22 16:13:25 np0005592767 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: Fallback order for Node 0: 0 
Jan 22 16:13:25 np0005592767 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 22 16:13:25 np0005592767 kernel: Policy zone: Normal
Jan 22 16:13:25 np0005592767 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 22 16:13:25 np0005592767 kernel: software IO TLB: area num 8.
Jan 22 16:13:25 np0005592767 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 22 16:13:25 np0005592767 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 22 16:13:25 np0005592767 kernel: ftrace: allocated 194 pages with 3 groups
Jan 22 16:13:25 np0005592767 kernel: Dynamic Preempt: voluntary
Jan 22 16:13:25 np0005592767 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 22 16:13:25 np0005592767 kernel: rcu: 	RCU event tracing is enabled.
Jan 22 16:13:25 np0005592767 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 22 16:13:25 np0005592767 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 22 16:13:25 np0005592767 kernel: 	Rude variant of Tasks RCU enabled.
Jan 22 16:13:25 np0005592767 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 22 16:13:25 np0005592767 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 22 16:13:25 np0005592767 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 22 16:13:25 np0005592767 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 16:13:25 np0005592767 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 16:13:25 np0005592767 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 22 16:13:25 np0005592767 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 22 16:13:25 np0005592767 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 22 16:13:25 np0005592767 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 22 16:13:25 np0005592767 kernel: Console: colour VGA+ 80x25
Jan 22 16:13:25 np0005592767 kernel: printk: console [ttyS0] enabled
Jan 22 16:13:25 np0005592767 kernel: ACPI: Core revision 20230331
Jan 22 16:13:25 np0005592767 kernel: APIC: Switch to symmetric I/O mode setup
Jan 22 16:13:25 np0005592767 kernel: x2apic enabled
Jan 22 16:13:25 np0005592767 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 22 16:13:25 np0005592767 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 22 16:13:25 np0005592767 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 22 16:13:25 np0005592767 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 22 16:13:25 np0005592767 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 22 16:13:25 np0005592767 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 22 16:13:25 np0005592767 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 22 16:13:25 np0005592767 kernel: Spectre V2 : Mitigation: Retpolines
Jan 22 16:13:25 np0005592767 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 22 16:13:25 np0005592767 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 22 16:13:25 np0005592767 kernel: RETBleed: Mitigation: untrained return thunk
Jan 22 16:13:25 np0005592767 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 22 16:13:25 np0005592767 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 22 16:13:25 np0005592767 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 22 16:13:25 np0005592767 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 22 16:13:25 np0005592767 kernel: x86/bugs: return thunk changed
Jan 22 16:13:25 np0005592767 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 22 16:13:25 np0005592767 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 22 16:13:25 np0005592767 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 22 16:13:25 np0005592767 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 22 16:13:25 np0005592767 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 22 16:13:25 np0005592767 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 22 16:13:25 np0005592767 kernel: Freeing SMP alternatives memory: 40K
Jan 22 16:13:25 np0005592767 kernel: pid_max: default: 32768 minimum: 301
Jan 22 16:13:25 np0005592767 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 22 16:13:25 np0005592767 kernel: landlock: Up and running.
Jan 22 16:13:25 np0005592767 kernel: Yama: becoming mindful.
Jan 22 16:13:25 np0005592767 kernel: SELinux:  Initializing.
Jan 22 16:13:25 np0005592767 kernel: LSM support for eBPF active
Jan 22 16:13:25 np0005592767 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 22 16:13:25 np0005592767 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 22 16:13:25 np0005592767 kernel: ... version:                0
Jan 22 16:13:25 np0005592767 kernel: ... bit width:              48
Jan 22 16:13:25 np0005592767 kernel: ... generic registers:      6
Jan 22 16:13:25 np0005592767 kernel: ... value mask:             0000ffffffffffff
Jan 22 16:13:25 np0005592767 kernel: ... max period:             00007fffffffffff
Jan 22 16:13:25 np0005592767 kernel: ... fixed-purpose events:   0
Jan 22 16:13:25 np0005592767 kernel: ... event mask:             000000000000003f
Jan 22 16:13:25 np0005592767 kernel: signal: max sigframe size: 1776
Jan 22 16:13:25 np0005592767 kernel: rcu: Hierarchical SRCU implementation.
Jan 22 16:13:25 np0005592767 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 22 16:13:25 np0005592767 kernel: smp: Bringing up secondary CPUs ...
Jan 22 16:13:25 np0005592767 kernel: smpboot: x86: Booting SMP configuration:
Jan 22 16:13:25 np0005592767 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 22 16:13:25 np0005592767 kernel: smp: Brought up 1 node, 8 CPUs
Jan 22 16:13:25 np0005592767 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 22 16:13:25 np0005592767 kernel: node 0 deferred pages initialised in 8ms
Jan 22 16:13:25 np0005592767 kernel: Memory: 7763552K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618368K reserved, 0K cma-reserved)
Jan 22 16:13:25 np0005592767 kernel: devtmpfs: initialized
Jan 22 16:13:25 np0005592767 kernel: x86/mm: Memory block size: 128MB
Jan 22 16:13:25 np0005592767 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 22 16:13:25 np0005592767 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 22 16:13:25 np0005592767 kernel: pinctrl core: initialized pinctrl subsystem
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 22 16:13:25 np0005592767 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 22 16:13:25 np0005592767 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 22 16:13:25 np0005592767 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 22 16:13:25 np0005592767 kernel: audit: initializing netlink subsys (disabled)
Jan 22 16:13:25 np0005592767 kernel: audit: type=2000 audit(1769116403.920:1): state=initialized audit_enabled=0 res=1
Jan 22 16:13:25 np0005592767 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 22 16:13:25 np0005592767 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 22 16:13:25 np0005592767 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 22 16:13:25 np0005592767 kernel: cpuidle: using governor menu
Jan 22 16:13:25 np0005592767 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 22 16:13:25 np0005592767 kernel: PCI: Using configuration type 1 for base access
Jan 22 16:13:25 np0005592767 kernel: PCI: Using configuration type 1 for extended access
Jan 22 16:13:25 np0005592767 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 22 16:13:25 np0005592767 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 22 16:13:25 np0005592767 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 22 16:13:25 np0005592767 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 22 16:13:25 np0005592767 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 22 16:13:25 np0005592767 kernel: Demotion targets for Node 0: null
Jan 22 16:13:25 np0005592767 kernel: cryptd: max_cpu_qlen set to 1000
Jan 22 16:13:25 np0005592767 kernel: ACPI: Added _OSI(Module Device)
Jan 22 16:13:25 np0005592767 kernel: ACPI: Added _OSI(Processor Device)
Jan 22 16:13:25 np0005592767 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 22 16:13:25 np0005592767 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 22 16:13:25 np0005592767 kernel: ACPI: Interpreter enabled
Jan 22 16:13:25 np0005592767 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 22 16:13:25 np0005592767 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 22 16:13:25 np0005592767 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 22 16:13:25 np0005592767 kernel: PCI: Using E820 reservations for host bridge windows
Jan 22 16:13:25 np0005592767 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 22 16:13:25 np0005592767 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 22 16:13:25 np0005592767 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [3] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [4] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [5] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [6] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [7] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [8] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [9] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [10] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [11] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [12] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [13] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [14] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [15] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [16] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [17] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [18] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [19] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [20] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [21] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [22] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [23] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [24] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [25] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [26] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [27] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [28] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [29] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [30] registered
Jan 22 16:13:25 np0005592767 kernel: acpiphp: Slot [31] registered
Jan 22 16:13:25 np0005592767 kernel: PCI host bridge to bus 0000:00
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 22 16:13:25 np0005592767 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 22 16:13:25 np0005592767 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 22 16:13:25 np0005592767 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 22 16:13:25 np0005592767 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 22 16:13:25 np0005592767 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 22 16:13:25 np0005592767 kernel: iommu: Default domain type: Translated
Jan 22 16:13:25 np0005592767 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 22 16:13:25 np0005592767 kernel: SCSI subsystem initialized
Jan 22 16:13:25 np0005592767 kernel: ACPI: bus type USB registered
Jan 22 16:13:25 np0005592767 kernel: usbcore: registered new interface driver usbfs
Jan 22 16:13:25 np0005592767 kernel: usbcore: registered new interface driver hub
Jan 22 16:13:25 np0005592767 kernel: usbcore: registered new device driver usb
Jan 22 16:13:25 np0005592767 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 22 16:13:25 np0005592767 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 22 16:13:25 np0005592767 kernel: PTP clock support registered
Jan 22 16:13:25 np0005592767 kernel: EDAC MC: Ver: 3.0.0
Jan 22 16:13:25 np0005592767 kernel: NetLabel: Initializing
Jan 22 16:13:25 np0005592767 kernel: NetLabel:  domain hash size = 128
Jan 22 16:13:25 np0005592767 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 22 16:13:25 np0005592767 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 22 16:13:25 np0005592767 kernel: PCI: Using ACPI for IRQ routing
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 22 16:13:25 np0005592767 kernel: vgaarb: loaded
Jan 22 16:13:25 np0005592767 kernel: clocksource: Switched to clocksource kvm-clock
Jan 22 16:13:25 np0005592767 kernel: VFS: Disk quotas dquot_6.6.0
Jan 22 16:13:25 np0005592767 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 22 16:13:25 np0005592767 kernel: pnp: PnP ACPI init
Jan 22 16:13:25 np0005592767 kernel: pnp: PnP ACPI: found 5 devices
Jan 22 16:13:25 np0005592767 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_INET protocol family
Jan 22 16:13:25 np0005592767 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 22 16:13:25 np0005592767 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_XDP protocol family
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 22 16:13:25 np0005592767 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 22 16:13:25 np0005592767 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 22 16:13:25 np0005592767 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 78702 usecs
Jan 22 16:13:25 np0005592767 kernel: PCI: CLS 0 bytes, default 64
Jan 22 16:13:25 np0005592767 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 22 16:13:25 np0005592767 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 22 16:13:25 np0005592767 kernel: ACPI: bus type thunderbolt registered
Jan 22 16:13:25 np0005592767 kernel: Trying to unpack rootfs image as initramfs...
Jan 22 16:13:25 np0005592767 kernel: Initialise system trusted keyrings
Jan 22 16:13:25 np0005592767 kernel: Key type blacklist registered
Jan 22 16:13:25 np0005592767 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 22 16:13:25 np0005592767 kernel: zbud: loaded
Jan 22 16:13:25 np0005592767 kernel: integrity: Platform Keyring initialized
Jan 22 16:13:25 np0005592767 kernel: integrity: Machine keyring initialized
Jan 22 16:13:25 np0005592767 kernel: Freeing initrd memory: 87956K
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_ALG protocol family
Jan 22 16:13:25 np0005592767 kernel: xor: automatically using best checksumming function   avx       
Jan 22 16:13:25 np0005592767 kernel: Key type asymmetric registered
Jan 22 16:13:25 np0005592767 kernel: Asymmetric key parser 'x509' registered
Jan 22 16:13:25 np0005592767 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 22 16:13:25 np0005592767 kernel: io scheduler mq-deadline registered
Jan 22 16:13:25 np0005592767 kernel: io scheduler kyber registered
Jan 22 16:13:25 np0005592767 kernel: io scheduler bfq registered
Jan 22 16:13:25 np0005592767 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 22 16:13:25 np0005592767 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 22 16:13:25 np0005592767 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 22 16:13:25 np0005592767 kernel: ACPI: button: Power Button [PWRF]
Jan 22 16:13:25 np0005592767 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 22 16:13:25 np0005592767 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 22 16:13:25 np0005592767 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 22 16:13:25 np0005592767 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 22 16:13:25 np0005592767 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 22 16:13:25 np0005592767 kernel: Non-volatile memory driver v1.3
Jan 22 16:13:25 np0005592767 kernel: rdac: device handler registered
Jan 22 16:13:25 np0005592767 kernel: hp_sw: device handler registered
Jan 22 16:13:25 np0005592767 kernel: emc: device handler registered
Jan 22 16:13:25 np0005592767 kernel: alua: device handler registered
Jan 22 16:13:25 np0005592767 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 22 16:13:25 np0005592767 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 22 16:13:25 np0005592767 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 22 16:13:25 np0005592767 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 22 16:13:25 np0005592767 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 22 16:13:25 np0005592767 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 22 16:13:25 np0005592767 kernel: usb usb1: Product: UHCI Host Controller
Jan 22 16:13:25 np0005592767 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 22 16:13:25 np0005592767 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 22 16:13:25 np0005592767 kernel: hub 1-0:1.0: USB hub found
Jan 22 16:13:25 np0005592767 kernel: hub 1-0:1.0: 2 ports detected
Jan 22 16:13:25 np0005592767 kernel: usbcore: registered new interface driver usbserial_generic
Jan 22 16:13:25 np0005592767 kernel: usbserial: USB Serial support registered for generic
Jan 22 16:13:25 np0005592767 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 22 16:13:25 np0005592767 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 22 16:13:25 np0005592767 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 22 16:13:25 np0005592767 kernel: mousedev: PS/2 mouse device common for all mice
Jan 22 16:13:25 np0005592767 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 22 16:13:25 np0005592767 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 22 16:13:25 np0005592767 kernel: rtc_cmos 00:04: registered as rtc0
Jan 22 16:13:25 np0005592767 kernel: rtc_cmos 00:04: setting system clock to 2026-01-22T21:13:24 UTC (1769116404)
Jan 22 16:13:25 np0005592767 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 22 16:13:25 np0005592767 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 22 16:13:25 np0005592767 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 22 16:13:25 np0005592767 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 22 16:13:25 np0005592767 kernel: usbcore: registered new interface driver usbhid
Jan 22 16:13:25 np0005592767 kernel: usbhid: USB HID core driver
Jan 22 16:13:25 np0005592767 kernel: drop_monitor: Initializing network drop monitor service
Jan 22 16:13:25 np0005592767 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 22 16:13:25 np0005592767 kernel: Initializing XFRM netlink socket
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_INET6 protocol family
Jan 22 16:13:25 np0005592767 kernel: Segment Routing with IPv6
Jan 22 16:13:25 np0005592767 kernel: NET: Registered PF_PACKET protocol family
Jan 22 16:13:25 np0005592767 kernel: mpls_gso: MPLS GSO support
Jan 22 16:13:25 np0005592767 kernel: IPI shorthand broadcast: enabled
Jan 22 16:13:25 np0005592767 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 22 16:13:25 np0005592767 kernel: AES CTR mode by8 optimization enabled
Jan 22 16:13:25 np0005592767 kernel: sched_clock: Marking stable (1182004600, 150768220)->(1455179480, -122406660)
Jan 22 16:13:25 np0005592767 kernel: registered taskstats version 1
Jan 22 16:13:25 np0005592767 kernel: Loading compiled-in X.509 certificates
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 22 16:13:25 np0005592767 kernel: Demotion targets for Node 0: null
Jan 22 16:13:25 np0005592767 kernel: page_owner is disabled
Jan 22 16:13:25 np0005592767 kernel: Key type .fscrypt registered
Jan 22 16:13:25 np0005592767 kernel: Key type fscrypt-provisioning registered
Jan 22 16:13:25 np0005592767 kernel: Key type big_key registered
Jan 22 16:13:25 np0005592767 kernel: Key type encrypted registered
Jan 22 16:13:25 np0005592767 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 22 16:13:25 np0005592767 kernel: Loading compiled-in module X.509 certificates
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 22 16:13:25 np0005592767 kernel: ima: Allocated hash algorithm: sha256
Jan 22 16:13:25 np0005592767 kernel: ima: No architecture policies found
Jan 22 16:13:25 np0005592767 kernel: evm: Initialising EVM extended attributes:
Jan 22 16:13:25 np0005592767 kernel: evm: security.selinux
Jan 22 16:13:25 np0005592767 kernel: evm: security.SMACK64 (disabled)
Jan 22 16:13:25 np0005592767 kernel: evm: security.SMACK64EXEC (disabled)
Jan 22 16:13:25 np0005592767 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 22 16:13:25 np0005592767 kernel: evm: security.SMACK64MMAP (disabled)
Jan 22 16:13:25 np0005592767 kernel: evm: security.apparmor (disabled)
Jan 22 16:13:25 np0005592767 kernel: evm: security.ima
Jan 22 16:13:25 np0005592767 kernel: evm: security.capability
Jan 22 16:13:25 np0005592767 kernel: evm: HMAC attrs: 0x1
Jan 22 16:13:25 np0005592767 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 22 16:13:25 np0005592767 kernel: Running certificate verification RSA selftest
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 22 16:13:25 np0005592767 kernel: Running certificate verification ECDSA selftest
Jan 22 16:13:25 np0005592767 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 22 16:13:25 np0005592767 kernel: clk: Disabling unused clocks
Jan 22 16:13:25 np0005592767 kernel: Freeing unused decrypted memory: 2028K
Jan 22 16:13:25 np0005592767 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 22 16:13:25 np0005592767 kernel: Write protecting the kernel read-only data: 30720k
Jan 22 16:13:25 np0005592767 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 22 16:13:25 np0005592767 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 22 16:13:25 np0005592767 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 22 16:13:25 np0005592767 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 22 16:13:25 np0005592767 kernel: usb 1-1: Manufacturer: QEMU
Jan 22 16:13:25 np0005592767 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 22 16:13:25 np0005592767 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 22 16:13:25 np0005592767 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 22 16:13:25 np0005592767 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 22 16:13:25 np0005592767 kernel: Run /init as init process
Jan 22 16:13:25 np0005592767 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 16:13:25 np0005592767 systemd: Detected virtualization kvm.
Jan 22 16:13:25 np0005592767 systemd: Detected architecture x86-64.
Jan 22 16:13:25 np0005592767 systemd: Running in initrd.
Jan 22 16:13:25 np0005592767 systemd: No hostname configured, using default hostname.
Jan 22 16:13:25 np0005592767 systemd: Hostname set to <localhost>.
Jan 22 16:13:25 np0005592767 systemd: Initializing machine ID from VM UUID.
Jan 22 16:13:25 np0005592767 systemd: Queued start job for default target Initrd Default Target.
Jan 22 16:13:25 np0005592767 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 16:13:25 np0005592767 systemd: Reached target Local Encrypted Volumes.
Jan 22 16:13:25 np0005592767 systemd: Reached target Initrd /usr File System.
Jan 22 16:13:25 np0005592767 systemd: Reached target Local File Systems.
Jan 22 16:13:25 np0005592767 systemd: Reached target Path Units.
Jan 22 16:13:25 np0005592767 systemd: Reached target Slice Units.
Jan 22 16:13:25 np0005592767 systemd: Reached target Swaps.
Jan 22 16:13:25 np0005592767 systemd: Reached target Timer Units.
Jan 22 16:13:25 np0005592767 systemd: Listening on D-Bus System Message Bus Socket.
Jan 22 16:13:25 np0005592767 systemd: Listening on Journal Socket (/dev/log).
Jan 22 16:13:25 np0005592767 systemd: Listening on Journal Socket.
Jan 22 16:13:25 np0005592767 systemd: Listening on udev Control Socket.
Jan 22 16:13:25 np0005592767 systemd: Listening on udev Kernel Socket.
Jan 22 16:13:25 np0005592767 systemd: Reached target Socket Units.
Jan 22 16:13:25 np0005592767 systemd: Starting Create List of Static Device Nodes...
Jan 22 16:13:25 np0005592767 systemd: Starting Journal Service...
Jan 22 16:13:25 np0005592767 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 16:13:25 np0005592767 systemd: Starting Apply Kernel Variables...
Jan 22 16:13:25 np0005592767 systemd: Starting Create System Users...
Jan 22 16:13:25 np0005592767 systemd: Starting Setup Virtual Console...
Jan 22 16:13:25 np0005592767 systemd: Finished Create List of Static Device Nodes.
Jan 22 16:13:25 np0005592767 systemd: Finished Apply Kernel Variables.
Jan 22 16:13:25 np0005592767 systemd: Finished Create System Users.
Jan 22 16:13:25 np0005592767 systemd-journald[306]: Journal started
Jan 22 16:13:25 np0005592767 systemd-journald[306]: Runtime Journal (/run/log/journal/094772d46a6e483898c5520e3f85ea8a) is 8.0M, max 153.6M, 145.6M free.
Jan 22 16:13:25 np0005592767 systemd-sysusers[311]: Creating group 'users' with GID 100.
Jan 22 16:13:25 np0005592767 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Jan 22 16:13:25 np0005592767 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 22 16:13:25 np0005592767 systemd: Started Journal Service.
Jan 22 16:13:25 np0005592767 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 16:13:25 np0005592767 systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 16:13:25 np0005592767 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 16:13:25 np0005592767 systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 16:13:25 np0005592767 systemd[1]: Finished Setup Virtual Console.
Jan 22 16:13:25 np0005592767 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 22 16:13:25 np0005592767 systemd[1]: Starting dracut cmdline hook...
Jan 22 16:13:25 np0005592767 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Jan 22 16:13:25 np0005592767 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 22 16:13:25 np0005592767 systemd[1]: Finished dracut cmdline hook.
Jan 22 16:13:25 np0005592767 systemd[1]: Starting dracut pre-udev hook...
Jan 22 16:13:26 np0005592767 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 22 16:13:26 np0005592767 kernel: device-mapper: uevent: version 1.0.3
Jan 22 16:13:26 np0005592767 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 22 16:13:26 np0005592767 kernel: RPC: Registered named UNIX socket transport module.
Jan 22 16:13:26 np0005592767 kernel: RPC: Registered udp transport module.
Jan 22 16:13:26 np0005592767 kernel: RPC: Registered tcp transport module.
Jan 22 16:13:26 np0005592767 kernel: RPC: Registered tcp-with-tls transport module.
Jan 22 16:13:26 np0005592767 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 22 16:13:26 np0005592767 rpc.statd[443]: Version 2.5.4 starting
Jan 22 16:13:26 np0005592767 rpc.statd[443]: Initializing NSM state
Jan 22 16:13:26 np0005592767 rpc.idmapd[448]: Setting log level to 0
Jan 22 16:13:26 np0005592767 systemd[1]: Finished dracut pre-udev hook.
Jan 22 16:13:26 np0005592767 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 16:13:26 np0005592767 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 16:13:26 np0005592767 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 16:13:26 np0005592767 systemd[1]: Starting dracut pre-trigger hook...
Jan 22 16:13:26 np0005592767 systemd[1]: Finished dracut pre-trigger hook.
Jan 22 16:13:26 np0005592767 systemd[1]: Starting Coldplug All udev Devices...
Jan 22 16:13:26 np0005592767 systemd[1]: Created slice Slice /system/modprobe.
Jan 22 16:13:26 np0005592767 systemd[1]: Starting Load Kernel Module configfs...
Jan 22 16:13:26 np0005592767 systemd[1]: Finished Coldplug All udev Devices.
Jan 22 16:13:26 np0005592767 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 16:13:26 np0005592767 systemd[1]: Finished Load Kernel Module configfs.
Jan 22 16:13:26 np0005592767 systemd[1]: Mounting Kernel Configuration File System...
Jan 22 16:13:26 np0005592767 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 16:13:26 np0005592767 systemd[1]: Reached target Network.
Jan 22 16:13:26 np0005592767 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 22 16:13:26 np0005592767 systemd[1]: Starting dracut initqueue hook...
Jan 22 16:13:26 np0005592767 systemd[1]: Mounted Kernel Configuration File System.
Jan 22 16:13:26 np0005592767 systemd[1]: Reached target System Initialization.
Jan 22 16:13:26 np0005592767 systemd[1]: Reached target Basic System.
Jan 22 16:13:26 np0005592767 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 22 16:13:26 np0005592767 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 22 16:13:26 np0005592767 kernel: vda: vda1
Jan 22 16:13:26 np0005592767 kernel: scsi host0: ata_piix
Jan 22 16:13:26 np0005592767 kernel: scsi host1: ata_piix
Jan 22 16:13:26 np0005592767 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 22 16:13:26 np0005592767 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 22 16:13:26 np0005592767 systemd-udevd[491]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:13:26 np0005592767 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 16:13:26 np0005592767 systemd[1]: Reached target Initrd Root Device.
Jan 22 16:13:26 np0005592767 kernel: ata1: found unknown device (class 0)
Jan 22 16:13:26 np0005592767 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 22 16:13:26 np0005592767 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 22 16:13:26 np0005592767 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 22 16:13:26 np0005592767 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 22 16:13:26 np0005592767 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 22 16:13:27 np0005592767 systemd[1]: Finished dracut initqueue hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 16:13:27 np0005592767 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 22 16:13:27 np0005592767 systemd[1]: Reached target Remote File Systems.
Jan 22 16:13:27 np0005592767 systemd[1]: Starting dracut pre-mount hook...
Jan 22 16:13:27 np0005592767 systemd[1]: Finished dracut pre-mount hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 22 16:13:27 np0005592767 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Jan 22 16:13:27 np0005592767 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 22 16:13:27 np0005592767 systemd[1]: Mounting /sysroot...
Jan 22 16:13:27 np0005592767 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 22 16:13:27 np0005592767 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 22 16:13:27 np0005592767 kernel: XFS (vda1): Ending clean mount
Jan 22 16:13:27 np0005592767 systemd[1]: Mounted /sysroot.
Jan 22 16:13:27 np0005592767 systemd[1]: Reached target Initrd Root File System.
Jan 22 16:13:27 np0005592767 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 22 16:13:27 np0005592767 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 22 16:13:27 np0005592767 systemd[1]: Reached target Initrd File Systems.
Jan 22 16:13:27 np0005592767 systemd[1]: Reached target Initrd Default Target.
Jan 22 16:13:27 np0005592767 systemd[1]: Starting dracut mount hook...
Jan 22 16:13:27 np0005592767 systemd[1]: Finished dracut mount hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 22 16:13:27 np0005592767 rpc.idmapd[448]: exiting on signal 15
Jan 22 16:13:27 np0005592767 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Network.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Timer Units.
Jan 22 16:13:27 np0005592767 systemd[1]: dbus.socket: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 22 16:13:27 np0005592767 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Initrd Default Target.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Basic System.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Initrd Root Device.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Initrd /usr File System.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Path Units.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Remote File Systems.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Slice Units.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Socket Units.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target System Initialization.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Local File Systems.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Swaps.
Jan 22 16:13:27 np0005592767 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped dracut mount hook.
Jan 22 16:13:27 np0005592767 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped dracut pre-mount hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 22 16:13:27 np0005592767 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 22 16:13:27 np0005592767 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped dracut initqueue hook.
Jan 22 16:13:27 np0005592767 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped Apply Kernel Variables.
Jan 22 16:13:27 np0005592767 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 22 16:13:27 np0005592767 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped Coldplug All udev Devices.
Jan 22 16:13:27 np0005592767 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopped dracut pre-trigger hook.
Jan 22 16:13:27 np0005592767 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped Setup Virtual Console.
Jan 22 16:13:28 np0005592767 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 22 16:13:28 np0005592767 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Closed udev Control Socket.
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Closed udev Kernel Socket.
Jan 22 16:13:28 np0005592767 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped dracut pre-udev hook.
Jan 22 16:13:28 np0005592767 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped dracut cmdline hook.
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Cleanup udev Database...
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 22 16:13:28 np0005592767 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Stopped Create System Users.
Jan 22 16:13:28 np0005592767 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Cleanup udev Database.
Jan 22 16:13:28 np0005592767 systemd[1]: Reached target Switch Root.
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Switch Root...
Jan 22 16:13:28 np0005592767 systemd[1]: Switching root.
Jan 22 16:13:28 np0005592767 systemd-journald[306]: Journal stopped
Jan 22 16:13:28 np0005592767 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 22 16:13:28 np0005592767 kernel: audit: type=1404 audit(1769116408.160:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:13:28 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:13:28 np0005592767 kernel: audit: type=1403 audit(1769116408.303:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 22 16:13:28 np0005592767 systemd: Successfully loaded SELinux policy in 149.099ms.
Jan 22 16:13:28 np0005592767 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 29.259ms.
Jan 22 16:13:28 np0005592767 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 22 16:13:28 np0005592767 systemd: Detected virtualization kvm.
Jan 22 16:13:28 np0005592767 systemd: Detected architecture x86-64.
Jan 22 16:13:28 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:13:28 np0005592767 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd: Stopped Switch Root.
Jan 22 16:13:28 np0005592767 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 22 16:13:28 np0005592767 systemd: Created slice Slice /system/getty.
Jan 22 16:13:28 np0005592767 systemd: Created slice Slice /system/serial-getty.
Jan 22 16:13:28 np0005592767 systemd: Created slice Slice /system/sshd-keygen.
Jan 22 16:13:28 np0005592767 systemd: Created slice User and Session Slice.
Jan 22 16:13:28 np0005592767 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 22 16:13:28 np0005592767 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 22 16:13:28 np0005592767 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 22 16:13:28 np0005592767 systemd: Reached target Local Encrypted Volumes.
Jan 22 16:13:28 np0005592767 systemd: Stopped target Switch Root.
Jan 22 16:13:28 np0005592767 systemd: Stopped target Initrd File Systems.
Jan 22 16:13:28 np0005592767 systemd: Stopped target Initrd Root File System.
Jan 22 16:13:28 np0005592767 systemd: Reached target Local Integrity Protected Volumes.
Jan 22 16:13:28 np0005592767 systemd: Reached target Path Units.
Jan 22 16:13:28 np0005592767 systemd: Reached target rpc_pipefs.target.
Jan 22 16:13:28 np0005592767 systemd: Reached target Slice Units.
Jan 22 16:13:28 np0005592767 systemd: Reached target Swaps.
Jan 22 16:13:28 np0005592767 systemd: Reached target Local Verity Protected Volumes.
Jan 22 16:13:28 np0005592767 systemd: Listening on RPCbind Server Activation Socket.
Jan 22 16:13:28 np0005592767 systemd: Reached target RPC Port Mapper.
Jan 22 16:13:28 np0005592767 systemd: Listening on Process Core Dump Socket.
Jan 22 16:13:28 np0005592767 systemd: Listening on initctl Compatibility Named Pipe.
Jan 22 16:13:28 np0005592767 systemd: Listening on udev Control Socket.
Jan 22 16:13:28 np0005592767 systemd: Listening on udev Kernel Socket.
Jan 22 16:13:28 np0005592767 systemd: Mounting Huge Pages File System...
Jan 22 16:13:28 np0005592767 systemd: Mounting POSIX Message Queue File System...
Jan 22 16:13:28 np0005592767 systemd: Mounting Kernel Debug File System...
Jan 22 16:13:28 np0005592767 systemd: Mounting Kernel Trace File System...
Jan 22 16:13:28 np0005592767 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 16:13:28 np0005592767 systemd: Starting Create List of Static Device Nodes...
Jan 22 16:13:28 np0005592767 systemd: Starting Load Kernel Module configfs...
Jan 22 16:13:28 np0005592767 systemd: Starting Load Kernel Module drm...
Jan 22 16:13:28 np0005592767 systemd: Starting Load Kernel Module efi_pstore...
Jan 22 16:13:28 np0005592767 systemd: Starting Load Kernel Module fuse...
Jan 22 16:13:28 np0005592767 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 22 16:13:28 np0005592767 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd: Stopped File System Check on Root Device.
Jan 22 16:13:28 np0005592767 systemd: Stopped Journal Service.
Jan 22 16:13:28 np0005592767 kernel: fuse: init (API version 7.37)
Jan 22 16:13:28 np0005592767 systemd: Starting Journal Service...
Jan 22 16:13:28 np0005592767 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 22 16:13:28 np0005592767 systemd: Starting Generate network units from Kernel command line...
Jan 22 16:13:28 np0005592767 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 16:13:28 np0005592767 systemd: Starting Remount Root and Kernel File Systems...
Jan 22 16:13:28 np0005592767 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 22 16:13:28 np0005592767 systemd: Starting Apply Kernel Variables...
Jan 22 16:13:28 np0005592767 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 22 16:13:28 np0005592767 systemd: Starting Coldplug All udev Devices...
Jan 22 16:13:28 np0005592767 systemd: Mounted Huge Pages File System.
Jan 22 16:13:28 np0005592767 systemd: Mounted POSIX Message Queue File System.
Jan 22 16:13:28 np0005592767 systemd: Mounted Kernel Debug File System.
Jan 22 16:13:28 np0005592767 systemd: Mounted Kernel Trace File System.
Jan 22 16:13:28 np0005592767 systemd: Finished Create List of Static Device Nodes.
Jan 22 16:13:28 np0005592767 systemd: modprobe@configfs.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd: Finished Load Kernel Module configfs.
Jan 22 16:13:28 np0005592767 systemd-journald[681]: Journal started
Jan 22 16:13:28 np0005592767 systemd-journald[681]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 16:13:28 np0005592767 systemd[1]: Queued start job for default target Multi-User System.
Jan 22 16:13:28 np0005592767 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd: Started Journal Service.
Jan 22 16:13:28 np0005592767 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 22 16:13:28 np0005592767 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Load Kernel Module fuse.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 22 16:13:28 np0005592767 kernel: ACPI: bus type drm_connector registered
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Generate network units from Kernel command line.
Jan 22 16:13:28 np0005592767 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Load Kernel Module drm.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Apply Kernel Variables.
Jan 22 16:13:28 np0005592767 systemd[1]: Mounting FUSE Control File System...
Jan 22 16:13:28 np0005592767 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Rebuild Hardware Database...
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 22 16:13:28 np0005592767 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Load/Save OS Random Seed...
Jan 22 16:13:28 np0005592767 systemd-journald[681]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 22 16:13:28 np0005592767 systemd-journald[681]: Received client request to flush runtime journal.
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Create System Users...
Jan 22 16:13:28 np0005592767 systemd[1]: Mounted FUSE Control File System.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Load/Save OS Random Seed.
Jan 22 16:13:28 np0005592767 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Create System Users.
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Coldplug All udev Devices.
Jan 22 16:13:28 np0005592767 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 22 16:13:28 np0005592767 systemd[1]: Reached target Preparation for Local File Systems.
Jan 22 16:13:28 np0005592767 systemd[1]: Reached target Local File Systems.
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 22 16:13:28 np0005592767 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 22 16:13:28 np0005592767 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 22 16:13:28 np0005592767 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Automatic Boot Loader Update...
Jan 22 16:13:28 np0005592767 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 22 16:13:28 np0005592767 systemd[1]: Starting Create Volatile Files and Directories...
Jan 22 16:13:29 np0005592767 bootctl[701]: Couldn't find EFI system partition, skipping.
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Automatic Boot Loader Update.
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Create Volatile Files and Directories.
Jan 22 16:13:29 np0005592767 systemd[1]: Starting Security Auditing Service...
Jan 22 16:13:29 np0005592767 systemd[1]: Starting RPC Bind...
Jan 22 16:13:29 np0005592767 systemd[1]: Starting Rebuild Journal Catalog...
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 22 16:13:29 np0005592767 auditd[707]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 22 16:13:29 np0005592767 auditd[707]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 22 16:13:29 np0005592767 systemd[1]: Started RPC Bind.
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Rebuild Journal Catalog.
Jan 22 16:13:29 np0005592767 augenrules[712]: /sbin/augenrules: No change
Jan 22 16:13:29 np0005592767 augenrules[727]: No rules
Jan 22 16:13:29 np0005592767 augenrules[727]: enabled 1
Jan 22 16:13:29 np0005592767 augenrules[727]: failure 1
Jan 22 16:13:29 np0005592767 augenrules[727]: pid 707
Jan 22 16:13:29 np0005592767 augenrules[727]: rate_limit 0
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_limit 8192
Jan 22 16:13:29 np0005592767 augenrules[727]: lost 0
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog 1
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_wait_time 60000
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_wait_time_actual 0
Jan 22 16:13:29 np0005592767 augenrules[727]: enabled 1
Jan 22 16:13:29 np0005592767 augenrules[727]: failure 1
Jan 22 16:13:29 np0005592767 augenrules[727]: pid 707
Jan 22 16:13:29 np0005592767 augenrules[727]: rate_limit 0
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_limit 8192
Jan 22 16:13:29 np0005592767 augenrules[727]: lost 0
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog 2
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_wait_time 60000
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_wait_time_actual 0
Jan 22 16:13:29 np0005592767 augenrules[727]: enabled 1
Jan 22 16:13:29 np0005592767 augenrules[727]: failure 1
Jan 22 16:13:29 np0005592767 augenrules[727]: pid 707
Jan 22 16:13:29 np0005592767 augenrules[727]: rate_limit 0
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_limit 8192
Jan 22 16:13:29 np0005592767 augenrules[727]: lost 0
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog 3
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_wait_time 60000
Jan 22 16:13:29 np0005592767 augenrules[727]: backlog_wait_time_actual 0
Jan 22 16:13:29 np0005592767 systemd[1]: Started Security Auditing Service.
Jan 22 16:13:29 np0005592767 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Rebuild Hardware Database.
Jan 22 16:13:29 np0005592767 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 22 16:13:29 np0005592767 systemd[1]: Starting Update is Completed...
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Update is Completed.
Jan 22 16:13:29 np0005592767 systemd-udevd[735]: Using default interface naming scheme 'rhel-9.0'.
Jan 22 16:13:29 np0005592767 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 22 16:13:29 np0005592767 systemd[1]: Starting Load Kernel Module configfs...
Jan 22 16:13:29 np0005592767 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 22 16:13:29 np0005592767 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 22 16:13:29 np0005592767 systemd[1]: Finished Load Kernel Module configfs.
Jan 22 16:13:29 np0005592767 systemd[1]: Reached target System Initialization.
Jan 22 16:13:29 np0005592767 systemd[1]: Started dnf makecache --timer.
Jan 22 16:13:29 np0005592767 systemd[1]: Started Daily rotation of log files.
Jan 22 16:13:29 np0005592767 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 22 16:13:29 np0005592767 systemd[1]: Reached target Timer Units.
Jan 22 16:13:29 np0005592767 systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:13:29 np0005592767 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 22 16:13:29 np0005592767 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 22 16:13:29 np0005592767 systemd[1]: Reached target Socket Units.
Jan 22 16:13:30 np0005592767 systemd[1]: Starting D-Bus System Message Bus...
Jan 22 16:13:30 np0005592767 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 16:13:30 np0005592767 systemd[1]: Started D-Bus System Message Bus.
Jan 22 16:13:30 np0005592767 systemd[1]: Reached target Basic System.
Jan 22 16:13:30 np0005592767 dbus-broker-lau[778]: Ready
Jan 22 16:13:30 np0005592767 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 22 16:13:30 np0005592767 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 22 16:13:30 np0005592767 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 22 16:13:30 np0005592767 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 22 16:13:30 np0005592767 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 22 16:13:30 np0005592767 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 22 16:13:30 np0005592767 kernel: Console: switching to colour dummy device 80x25
Jan 22 16:13:30 np0005592767 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 22 16:13:30 np0005592767 kernel: [drm] features: -context_init
Jan 22 16:13:30 np0005592767 systemd[1]: Starting NTP client/server...
Jan 22 16:13:30 np0005592767 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 22 16:13:30 np0005592767 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 22 16:13:30 np0005592767 chronyd[789]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 16:13:30 np0005592767 chronyd[789]: Loaded 0 symmetric keys
Jan 22 16:13:30 np0005592767 systemd[1]: Starting IPv4 firewall with iptables...
Jan 22 16:13:30 np0005592767 chronyd[789]: Using right/UTC timezone to obtain leap second data
Jan 22 16:13:30 np0005592767 chronyd[789]: Loaded seccomp filter (level 2)
Jan 22 16:13:30 np0005592767 systemd[1]: Started irqbalance daemon.
Jan 22 16:13:30 np0005592767 kernel: [drm] number of scanouts: 1
Jan 22 16:13:30 np0005592767 kernel: [drm] number of cap sets: 0
Jan 22 16:13:30 np0005592767 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 22 16:13:30 np0005592767 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:13:30 np0005592767 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:13:30 np0005592767 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 16:13:30 np0005592767 systemd[1]: Reached target sshd-keygen.target.
Jan 22 16:13:30 np0005592767 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 22 16:13:30 np0005592767 systemd[1]: Reached target User and Group Name Lookups.
Jan 22 16:13:30 np0005592767 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 22 16:13:30 np0005592767 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 22 16:13:30 np0005592767 kernel: Console: switching to colour frame buffer device 128x48
Jan 22 16:13:30 np0005592767 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 22 16:13:30 np0005592767 systemd[1]: Starting User Login Management...
Jan 22 16:13:30 np0005592767 systemd[1]: Started NTP client/server.
Jan 22 16:13:30 np0005592767 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 22 16:13:30 np0005592767 kernel: kvm_amd: TSC scaling supported
Jan 22 16:13:30 np0005592767 kernel: kvm_amd: Nested Virtualization enabled
Jan 22 16:13:30 np0005592767 kernel: kvm_amd: Nested Paging enabled
Jan 22 16:13:30 np0005592767 kernel: kvm_amd: LBR virtualization supported
Jan 22 16:13:30 np0005592767 systemd-logind[802]: New seat seat0.
Jan 22 16:13:30 np0005592767 systemd-logind[802]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 16:13:30 np0005592767 systemd-logind[802]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 16:13:30 np0005592767 systemd[1]: Started User Login Management.
Jan 22 16:13:30 np0005592767 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 22 16:13:30 np0005592767 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 22 16:13:30 np0005592767 iptables.init[790]: iptables: Applying firewall rules: [  OK  ]
Jan 22 16:13:30 np0005592767 systemd[1]: Finished IPv4 firewall with iptables.
Jan 22 16:13:30 np0005592767 cloud-init[843]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 22 Jan 2026 21:13:30 +0000. Up 7.26 seconds.
Jan 22 16:13:30 np0005592767 systemd[1]: run-cloud\x2dinit-tmp-tmpdzbu_s8z.mount: Deactivated successfully.
Jan 22 16:13:30 np0005592767 systemd[1]: Starting Hostname Service...
Jan 22 16:13:30 np0005592767 systemd[1]: Started Hostname Service.
Jan 22 16:13:30 np0005592767 systemd-hostnamed[857]: Hostname set to <np0005592767.novalocal> (static)
Jan 22 16:13:31 np0005592767 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 22 16:13:31 np0005592767 systemd[1]: Reached target Preparation for Network.
Jan 22 16:13:31 np0005592767 systemd[1]: Starting Network Manager...
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.1842] NetworkManager (version 1.54.3-2.el9) is starting... (boot:339c1445-6b44-44ff-b543-d72e4d6762b9)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.1848] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.1930] manager[0x56076c9b4000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.1966] hostname: hostname: using hostnamed
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.1968] hostname: static hostname changed from (none) to "np0005592767.novalocal"
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.1974] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2094] manager[0x56076c9b4000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2095] manager[0x56076c9b4000]: rfkill: WWAN hardware radio set enabled
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2139] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2140] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2140] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2141] manager: Networking is enabled by state file
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2143] settings: Loaded settings plugin: keyfile (internal)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2152] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2177] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2191] dhcp: init: Using DHCP client 'internal'
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2194] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 16:13:31 np0005592767 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2208] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2238] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2247] device (lo): Activation: starting connection 'lo' (7e49d025-cbad-4eaf-af6a-04deb1852570)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2257] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2260] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2294] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2299] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2302] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2304] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2306] device (eth0): carrier: link connected
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2310] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2317] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2325] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2330] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2331] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2334] manager: NetworkManager state is now CONNECTING
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2336] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2344] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2348] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:13:31 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2414] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Jan 22 16:13:31 np0005592767 systemd[1]: Started Network Manager.
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2443] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 16:13:31 np0005592767 systemd[1]: Reached target Network.
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2490] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:13:31 np0005592767 systemd[1]: Starting Network Manager Wait Online...
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2541] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2544] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2552] device (lo): Activation: successful, device activated.
Jan 22 16:13:31 np0005592767 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2596] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2598] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2603] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2605] device (eth0): Activation: successful, device activated.
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2612] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 16:13:31 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:13:31 np0005592767 NetworkManager[862]: <info>  [1769116411.2614] manager: startup complete
Jan 22 16:13:31 np0005592767 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 22 16:13:31 np0005592767 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 22 16:13:31 np0005592767 systemd[1]: Reached target NFS client services.
Jan 22 16:13:31 np0005592767 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 22 16:13:31 np0005592767 systemd[1]: Reached target Remote File Systems.
Jan 22 16:13:31 np0005592767 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 22 16:13:31 np0005592767 systemd[1]: Finished Network Manager Wait Online.
Jan 22 16:13:31 np0005592767 systemd[1]: Starting Cloud-init: Network Stage...
Jan 22 16:13:31 np0005592767 cloud-init[926]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 22 Jan 2026 21:13:31 +0000. Up 8.25 seconds.
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |  eth0  | True |        38.102.83.132         | 255.255.255.0 | global | fa:16:3e:d0:1f:eb |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |  eth0  | True | fe80::f816:3eff:fed0:1feb/64 |       .       |  link  | fa:16:3e:d0:1f:eb |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 22 16:13:31 np0005592767 cloud-init[926]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 22 16:13:32 np0005592767 cloud-init[926]: Generating public/private rsa key pair.
Jan 22 16:13:32 np0005592767 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 22 16:13:32 np0005592767 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 22 16:13:32 np0005592767 cloud-init[926]: The key fingerprint is:
Jan 22 16:13:32 np0005592767 cloud-init[926]: SHA256:ZbrIyibs9s3uVsgseC+vpVn+ouGBpJTsFWxKfmj91xw root@np0005592767.novalocal
Jan 22 16:13:32 np0005592767 cloud-init[926]: The key's randomart image is:
Jan 22 16:13:32 np0005592767 cloud-init[926]: +---[RSA 3072]----+
Jan 22 16:13:32 np0005592767 cloud-init[926]: |                 |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |  .              |
Jan 22 16:13:32 np0005592767 cloud-init[926]: | . +      o      |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |+ * .    +       |
Jan 22 16:13:32 np0005592767 cloud-init[926]: | O.= o .SE       |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |+o+.+.+.+..      |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |.o..o+=o.o       |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |  +oo%=          |
Jan 22 16:13:32 np0005592767 cloud-init[926]: | o.+XOBo.        |
Jan 22 16:13:32 np0005592767 cloud-init[926]: +----[SHA256]-----+
Jan 22 16:13:32 np0005592767 cloud-init[926]: Generating public/private ecdsa key pair.
Jan 22 16:13:32 np0005592767 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 22 16:13:32 np0005592767 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 22 16:13:32 np0005592767 cloud-init[926]: The key fingerprint is:
Jan 22 16:13:32 np0005592767 cloud-init[926]: SHA256:uB3otap17s4bEC/ZTyt107T3+0uvAldu4xahCl2S60M root@np0005592767.novalocal
Jan 22 16:13:32 np0005592767 cloud-init[926]: The key's randomart image is:
Jan 22 16:13:32 np0005592767 cloud-init[926]: +---[ECDSA 256]---+
Jan 22 16:13:32 np0005592767 cloud-init[926]: |                 |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |                 |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |      .    .  .  |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |       B  o .oo. |
Jan 22 16:13:32 np0005592767 cloud-init[926]: |      * S.o+o+o..|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |     . *.*Eoo.=..|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |      + *+o+ o +.|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |     . = o+ . + o|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |    ...o*. . o.+=|
Jan 22 16:13:32 np0005592767 cloud-init[926]: +----[SHA256]-----+
Jan 22 16:13:32 np0005592767 cloud-init[926]: Generating public/private ed25519 key pair.
Jan 22 16:13:32 np0005592767 cloud-init[926]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 22 16:13:32 np0005592767 cloud-init[926]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 22 16:13:32 np0005592767 cloud-init[926]: The key fingerprint is:
Jan 22 16:13:32 np0005592767 cloud-init[926]: SHA256:xdrh6aY5+rq/fc27qJXXt/Ca3JVjdJlQvQfoFjVBQI8 root@np0005592767.novalocal
Jan 22 16:13:32 np0005592767 cloud-init[926]: The key's randomart image is:
Jan 22 16:13:32 np0005592767 cloud-init[926]: +--[ED25519 256]--+
Jan 22 16:13:32 np0005592767 cloud-init[926]: |           .o==o.|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |         .  ooo..|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |          +.Eo...|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |         = oo ..+|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |        S +.   +o|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |         .   ...o|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |          o o+.++|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |        .= .ooO =|
Jan 22 16:13:32 np0005592767 cloud-init[926]: |      +**ooo.++* |
Jan 22 16:13:32 np0005592767 cloud-init[926]: +----[SHA256]-----+
Jan 22 16:13:32 np0005592767 systemd[1]: Finished Cloud-init: Network Stage.
Jan 22 16:13:32 np0005592767 systemd[1]: Reached target Cloud-config availability.
Jan 22 16:13:32 np0005592767 systemd[1]: Reached target Network is Online.
Jan 22 16:13:32 np0005592767 systemd[1]: Starting Cloud-init: Config Stage...
Jan 22 16:13:32 np0005592767 systemd[1]: Starting Crash recovery kernel arming...
Jan 22 16:13:32 np0005592767 systemd[1]: Starting Notify NFS peers of a restart...
Jan 22 16:13:32 np0005592767 systemd[1]: Starting System Logging Service...
Jan 22 16:13:32 np0005592767 sm-notify[1008]: Version 2.5.4 starting
Jan 22 16:13:32 np0005592767 systemd[1]: Starting OpenSSH server daemon...
Jan 22 16:13:32 np0005592767 systemd[1]: Starting Permit User Sessions...
Jan 22 16:13:32 np0005592767 systemd[1]: Started Notify NFS peers of a restart.
Jan 22 16:13:32 np0005592767 systemd[1]: Finished Permit User Sessions.
Jan 22 16:13:32 np0005592767 systemd[1]: Started Command Scheduler.
Jan 22 16:13:33 np0005592767 rsyslogd[1009]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1009" x-info="https://www.rsyslog.com"] start
Jan 22 16:13:33 np0005592767 rsyslogd[1009]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 22 16:13:33 np0005592767 systemd[1]: Started Getty on tty1.
Jan 22 16:13:33 np0005592767 systemd[1]: Started Serial Getty on ttyS0.
Jan 22 16:13:33 np0005592767 systemd[1]: Reached target Login Prompts.
Jan 22 16:13:33 np0005592767 systemd[1]: Started OpenSSH server daemon.
Jan 22 16:13:33 np0005592767 systemd[1]: Started System Logging Service.
Jan 22 16:13:33 np0005592767 systemd[1]: Reached target Multi-User System.
Jan 22 16:13:33 np0005592767 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 22 16:13:33 np0005592767 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 22 16:13:33 np0005592767 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 22 16:13:33 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:13:33 np0005592767 kdumpctl[1021]: kdump: No kdump initial ramdisk found.
Jan 22 16:13:33 np0005592767 kdumpctl[1021]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 22 16:13:33 np0005592767 cloud-init[1149]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 22 Jan 2026 21:13:33 +0000. Up 9.88 seconds.
Jan 22 16:13:33 np0005592767 systemd[1]: Finished Cloud-init: Config Stage.
Jan 22 16:13:33 np0005592767 systemd[1]: Starting Cloud-init: Final Stage...
Jan 22 16:13:33 np0005592767 dracut[1270]: dracut-057-102.git20250818.el9
Jan 22 16:13:33 np0005592767 cloud-init[1288]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 22 Jan 2026 21:13:33 +0000. Up 10.25 seconds.
Jan 22 16:13:33 np0005592767 cloud-init[1299]: #############################################################
Jan 22 16:13:33 np0005592767 dracut[1272]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 22 16:13:33 np0005592767 cloud-init[1301]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 22 16:13:33 np0005592767 cloud-init[1309]: 256 SHA256:uB3otap17s4bEC/ZTyt107T3+0uvAldu4xahCl2S60M root@np0005592767.novalocal (ECDSA)
Jan 22 16:13:33 np0005592767 cloud-init[1315]: 256 SHA256:xdrh6aY5+rq/fc27qJXXt/Ca3JVjdJlQvQfoFjVBQI8 root@np0005592767.novalocal (ED25519)
Jan 22 16:13:33 np0005592767 cloud-init[1320]: 3072 SHA256:ZbrIyibs9s3uVsgseC+vpVn+ouGBpJTsFWxKfmj91xw root@np0005592767.novalocal (RSA)
Jan 22 16:13:33 np0005592767 cloud-init[1325]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 22 16:13:33 np0005592767 cloud-init[1326]: #############################################################
Jan 22 16:13:33 np0005592767 cloud-init[1288]: Cloud-init v. 24.4-8.el9 finished at Thu, 22 Jan 2026 21:13:33 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.41 seconds
Jan 22 16:13:33 np0005592767 systemd[1]: Finished Cloud-init: Final Stage.
Jan 22 16:13:33 np0005592767 systemd[1]: Reached target Cloud-init target.
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: memstrack is not available
Jan 22 16:13:34 np0005592767 dracut[1272]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 22 16:13:34 np0005592767 dracut[1272]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 22 16:13:35 np0005592767 dracut[1272]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 22 16:13:35 np0005592767 dracut[1272]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 22 16:13:35 np0005592767 dracut[1272]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 22 16:13:35 np0005592767 dracut[1272]: memstrack is not available
Jan 22 16:13:35 np0005592767 dracut[1272]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 22 16:13:35 np0005592767 dracut[1272]: *** Including module: systemd ***
Jan 22 16:13:35 np0005592767 dracut[1272]: *** Including module: fips ***
Jan 22 16:13:35 np0005592767 dracut[1272]: *** Including module: systemd-initrd ***
Jan 22 16:13:35 np0005592767 dracut[1272]: *** Including module: i18n ***
Jan 22 16:13:35 np0005592767 dracut[1272]: *** Including module: drm ***
Jan 22 16:13:36 np0005592767 chronyd[789]: Selected source 206.108.0.132 (2.centos.pool.ntp.org)
Jan 22 16:13:36 np0005592767 chronyd[789]: System clock TAI offset set to 37 seconds
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: prefixdevname ***
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: kernel-modules ***
Jan 22 16:13:36 np0005592767 kernel: block vda: the capability attribute has been deprecated.
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: kernel-modules-extra ***
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: qemu ***
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: fstab-sys ***
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: rootfs-block ***
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: terminfo ***
Jan 22 16:13:36 np0005592767 dracut[1272]: *** Including module: udev-rules ***
Jan 22 16:13:37 np0005592767 dracut[1272]: Skipping udev rule: 91-permissions.rules
Jan 22 16:13:37 np0005592767 dracut[1272]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 22 16:13:37 np0005592767 dracut[1272]: *** Including module: virtiofs ***
Jan 22 16:13:37 np0005592767 dracut[1272]: *** Including module: dracut-systemd ***
Jan 22 16:13:37 np0005592767 dracut[1272]: *** Including module: usrmount ***
Jan 22 16:13:37 np0005592767 dracut[1272]: *** Including module: base ***
Jan 22 16:13:37 np0005592767 dracut[1272]: *** Including module: fs-lib ***
Jan 22 16:13:37 np0005592767 dracut[1272]: *** Including module: kdumpbase ***
Jan 22 16:13:38 np0005592767 dracut[1272]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 22 16:13:38 np0005592767 dracut[1272]:  microcode_ctl module: mangling fw_dir
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 22 16:13:38 np0005592767 dracut[1272]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 22 16:13:38 np0005592767 dracut[1272]: *** Including module: openssl ***
Jan 22 16:13:38 np0005592767 dracut[1272]: *** Including module: shutdown ***
Jan 22 16:13:38 np0005592767 dracut[1272]: *** Including module: squash ***
Jan 22 16:13:38 np0005592767 dracut[1272]: *** Including modules done ***
Jan 22 16:13:38 np0005592767 dracut[1272]: *** Installing kernel module dependencies ***
Jan 22 16:13:39 np0005592767 dracut[1272]: *** Installing kernel module dependencies done ***
Jan 22 16:13:39 np0005592767 dracut[1272]: *** Resolving executable dependencies ***
Jan 22 16:13:40 np0005592767 irqbalance[797]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 22 16:13:40 np0005592767 irqbalance[797]: IRQ 25 affinity is now unmanaged
Jan 22 16:13:40 np0005592767 irqbalance[797]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 22 16:13:40 np0005592767 irqbalance[797]: IRQ 31 affinity is now unmanaged
Jan 22 16:13:40 np0005592767 irqbalance[797]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 22 16:13:40 np0005592767 irqbalance[797]: IRQ 28 affinity is now unmanaged
Jan 22 16:13:40 np0005592767 irqbalance[797]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 22 16:13:40 np0005592767 irqbalance[797]: IRQ 32 affinity is now unmanaged
Jan 22 16:13:40 np0005592767 irqbalance[797]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 22 16:13:40 np0005592767 irqbalance[797]: IRQ 30 affinity is now unmanaged
Jan 22 16:13:40 np0005592767 irqbalance[797]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 22 16:13:40 np0005592767 irqbalance[797]: IRQ 29 affinity is now unmanaged
Jan 22 16:13:41 np0005592767 dracut[1272]: *** Resolving executable dependencies done ***
Jan 22 16:13:41 np0005592767 dracut[1272]: *** Generating early-microcode cpio image ***
Jan 22 16:13:41 np0005592767 dracut[1272]: *** Store current command line parameters ***
Jan 22 16:13:41 np0005592767 dracut[1272]: Stored kernel commandline:
Jan 22 16:13:41 np0005592767 dracut[1272]: No dracut internal kernel commandline stored in the initramfs
Jan 22 16:13:41 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:13:41 np0005592767 dracut[1272]: *** Install squash loader ***
Jan 22 16:13:42 np0005592767 dracut[1272]: *** Squashing the files inside the initramfs ***
Jan 22 16:13:43 np0005592767 dracut[1272]: *** Squashing the files inside the initramfs done ***
Jan 22 16:13:43 np0005592767 dracut[1272]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 22 16:13:43 np0005592767 dracut[1272]: *** Hardlinking files ***
Jan 22 16:13:43 np0005592767 dracut[1272]: *** Hardlinking files done ***
Jan 22 16:13:43 np0005592767 dracut[1272]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 22 16:13:45 np0005592767 kdumpctl[1021]: kdump: kexec: loaded kdump kernel
Jan 22 16:13:45 np0005592767 kdumpctl[1021]: kdump: Starting kdump: [OK]
Jan 22 16:13:45 np0005592767 systemd[1]: Finished Crash recovery kernel arming.
Jan 22 16:13:45 np0005592767 systemd[1]: Startup finished in 1.599s (kernel) + 3.188s (initrd) + 16.891s (userspace) = 21.679s.
Jan 22 16:13:50 np0005592767 systemd[1]: Created slice User Slice of UID 1000.
Jan 22 16:13:50 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 22 16:13:50 np0005592767 systemd-logind[802]: New session 1 of user zuul.
Jan 22 16:13:50 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 22 16:13:50 np0005592767 systemd[1]: Starting User Manager for UID 1000...
Jan 22 16:13:50 np0005592767 systemd[4310]: Queued start job for default target Main User Target.
Jan 22 16:13:50 np0005592767 systemd[4310]: Created slice User Application Slice.
Jan 22 16:13:50 np0005592767 systemd[4310]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 16:13:50 np0005592767 systemd[4310]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 16:13:50 np0005592767 systemd[4310]: Reached target Paths.
Jan 22 16:13:50 np0005592767 systemd[4310]: Reached target Timers.
Jan 22 16:13:50 np0005592767 systemd[4310]: Starting D-Bus User Message Bus Socket...
Jan 22 16:13:50 np0005592767 systemd[4310]: Starting Create User's Volatile Files and Directories...
Jan 22 16:13:50 np0005592767 systemd[4310]: Listening on D-Bus User Message Bus Socket.
Jan 22 16:13:50 np0005592767 systemd[4310]: Reached target Sockets.
Jan 22 16:13:50 np0005592767 systemd[4310]: Finished Create User's Volatile Files and Directories.
Jan 22 16:13:50 np0005592767 systemd[4310]: Reached target Basic System.
Jan 22 16:13:50 np0005592767 systemd[4310]: Reached target Main User Target.
Jan 22 16:13:50 np0005592767 systemd[4310]: Startup finished in 129ms.
Jan 22 16:13:50 np0005592767 systemd[1]: Started User Manager for UID 1000.
Jan 22 16:13:50 np0005592767 systemd[1]: Started Session 1 of User zuul.
Jan 22 16:13:51 np0005592767 python3[4392]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:14:01 np0005592767 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:14:04 np0005592767 python3[4422]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:14:10 np0005592767 python3[4480]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:14:11 np0005592767 python3[4520]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 22 16:14:14 np0005592767 python3[4546]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMeBm6NiIvVxetJ4FCT6X6eqh4y+XGlxa2O9SnCM87Rvd3hWwicZru82vfPJxoAbseaSfdfZfa5Oaf5dhIru0B1DVPR+Y21uBaSUcO1K8p5tC2tzE5lkAIy/kRVSwMYfC5dEpExJw+20uHDLhVtsOAMKhThwm/XsS/9yy8cQG4ADfn2gl1nOfLWdeDsMspTuwYbF5uu9ANML8AymvhI9P057RIvVPP3XADkxcthWmeoY31Rv8JlJGXn9R9yr9bXjaXt1WnmADIMvCooPBtjoHFkzec9uGiq2KbPxYijX4nkBK7VCl+z7mv0qda4ub0iuJwaz74mccey9rlhgqsbW68VK8P5ok/O5AYo7MrOUCGbNrU9JgXrMTk2Iu7TMLxDuT0VdEs8Q2UG15+ASQiyG6zYkOCJ02VjwHLQyQ73PJXkt2gFQHX1iBFOvYo2QMz4/MD4kAU/TCfKhXngyzI4H7PhTJJ3yrwNrOT4XzOSSfMhvBlszNp33r0FR4w0Oh/ssE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:14 np0005592767 python3[4570]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:14 np0005592767 python3[4669]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:15 np0005592767 python3[4740]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769116454.6326194-254-173505891188848/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=e08bc08cd036420dba87014ca8d05e85_id_rsa follow=False checksum=f2ae46c25e13ca69ac26d9c69f9985b01dd0fed0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:15 np0005592767 python3[4863]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:16 np0005592767 python3[4934]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769116455.6148803-308-274350830562801/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=e08bc08cd036420dba87014ca8d05e85_id_rsa.pub follow=False checksum=9abcadb7ea942e24eaeef4fb6992f6baca8c0b83 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:17 np0005592767 python3[4982]: ansible-ping Invoked with data=pong
Jan 22 16:14:18 np0005592767 python3[5006]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:14:20 np0005592767 python3[5064]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 22 16:14:21 np0005592767 python3[5096]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:21 np0005592767 python3[5120]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:22 np0005592767 python3[5144]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:22 np0005592767 python3[5168]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:22 np0005592767 python3[5192]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:22 np0005592767 python3[5216]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:24 np0005592767 python3[5242]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:25 np0005592767 python3[5320]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:25 np0005592767 python3[5393]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769116464.6765313-35-181603512879485/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:26 np0005592767 python3[5441]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:26 np0005592767 python3[5465]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:26 np0005592767 python3[5489]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:27 np0005592767 python3[5513]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:27 np0005592767 python3[5537]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:27 np0005592767 python3[5561]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:28 np0005592767 python3[5585]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:28 np0005592767 python3[5609]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:28 np0005592767 python3[5633]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:28 np0005592767 python3[5657]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:29 np0005592767 python3[5681]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:29 np0005592767 python3[5705]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:29 np0005592767 python3[5729]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:30 np0005592767 python3[5753]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:30 np0005592767 python3[5777]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:30 np0005592767 python3[5801]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:30 np0005592767 python3[5825]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:31 np0005592767 python3[5849]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:31 np0005592767 python3[5873]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:31 np0005592767 python3[5897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:32 np0005592767 python3[5921]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:32 np0005592767 python3[5945]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:32 np0005592767 python3[5969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:32 np0005592767 python3[5993]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:33 np0005592767 python3[6017]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:33 np0005592767 python3[6041]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:14:35 np0005592767 python3[6067]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 16:14:35 np0005592767 systemd[1]: Starting Time & Date Service...
Jan 22 16:14:35 np0005592767 systemd[1]: Started Time & Date Service.
Jan 22 16:14:35 np0005592767 systemd-timedated[6069]: Changed time zone to 'UTC' (UTC).
Jan 22 16:14:36 np0005592767 python3[6098]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:37 np0005592767 python3[6174]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:37 np0005592767 python3[6245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769116476.9159467-254-82188275737732/source _original_basename=tmp3s1eqyoi follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:38 np0005592767 python3[6345]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:38 np0005592767 python3[6416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769116477.8591413-305-14651482269063/source _original_basename=tmpwhrz9u07 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:39 np0005592767 python3[6518]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:39 np0005592767 python3[6591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769116478.9660885-384-17251173979803/source _original_basename=tmp5munc9j4 follow=False checksum=2e193f101b911db5e638a5fc33120ba1c99c8f88 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:40 np0005592767 python3[6639]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:14:40 np0005592767 python3[6665]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:14:40 np0005592767 python3[6745]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:14:41 np0005592767 python3[6818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769116480.7450907-455-98932223300138/source _original_basename=tmpiiri2kz5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:14:41 np0005592767 python3[6869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-a621-2aaa-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:14:42 np0005592767 python3[6897]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-a621-2aaa-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 22 16:14:42 np0005592767 chronyd[789]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 22 16:14:43 np0005592767 python3[6925]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:15:02 np0005592767 python3[6951]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:15:05 np0005592767 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 16:15:57 np0005592767 systemd[4310]: Starting Mark boot as successful...
Jan 22 16:15:57 np0005592767 systemd[4310]: Finished Mark boot as successful.
Jan 22 16:16:02 np0005592767 systemd-logind[802]: Session 1 logged out. Waiting for processes to exit.
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 22 16:16:25 np0005592767 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 22 16:16:25 np0005592767 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2630] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 16:16:25 np0005592767 systemd-udevd[6958]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2839] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2880] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2885] device (eth1): carrier: link connected
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2888] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2898] policy: auto-activating connection 'Wired connection 1' (42da23fc-7953-36b0-b86e-9cee7da143fd)
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2905] device (eth1): Activation: starting connection 'Wired connection 1' (42da23fc-7953-36b0-b86e-9cee7da143fd)
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2906] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2911] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2917] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:16:25 np0005592767 NetworkManager[862]: <info>  [1769116585.2925] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:16:25 np0005592767 systemd-logind[802]: New session 3 of user zuul.
Jan 22 16:16:25 np0005592767 systemd[1]: Started Session 3 of User zuul.
Jan 22 16:16:26 np0005592767 python3[6988]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-9ef2-05fc-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:16:33 np0005592767 python3[7068]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:16:33 np0005592767 python3[7141]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769116593.0016227-206-154175345750396/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=49b90886afeed50accf7769cf4aabc52554ecf66 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:16:34 np0005592767 python3[7191]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:16:34 np0005592767 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 16:16:34 np0005592767 systemd[1]: Stopped Network Manager Wait Online.
Jan 22 16:16:34 np0005592767 systemd[1]: Stopping Network Manager Wait Online...
Jan 22 16:16:34 np0005592767 systemd[1]: Stopping Network Manager...
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1426] caught SIGTERM, shutting down normally.
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1435] dhcp4 (eth0): canceled DHCP transaction
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1436] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1436] dhcp4 (eth0): state changed no lease
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1437] manager: NetworkManager state is now CONNECTING
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1582] dhcp4 (eth1): canceled DHCP transaction
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1583] dhcp4 (eth1): state changed no lease
Jan 22 16:16:34 np0005592767 NetworkManager[862]: <info>  [1769116594.1612] exiting (success)
Jan 22 16:16:34 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:16:34 np0005592767 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 16:16:34 np0005592767 systemd[1]: Stopped Network Manager.
Jan 22 16:16:34 np0005592767 systemd[1]: NetworkManager.service: Consumed 1.372s CPU time, 10.1M memory peak.
Jan 22 16:16:34 np0005592767 systemd[1]: Starting Network Manager...
Jan 22 16:16:34 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.2110] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:339c1445-6b44-44ff-b543-d72e4d6762b9)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.2114] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.2179] manager[0x559e67198000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 16:16:34 np0005592767 systemd[1]: Starting Hostname Service...
Jan 22 16:16:34 np0005592767 systemd[1]: Started Hostname Service.
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3068] hostname: hostname: using hostnamed
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3069] hostname: static hostname changed from (none) to "np0005592767.novalocal"
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3078] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3086] manager[0x559e67198000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3087] manager[0x559e67198000]: rfkill: WWAN hardware radio set enabled
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3140] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3141] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3142] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3143] manager: Networking is enabled by state file
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3147] settings: Loaded settings plugin: keyfile (internal)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3155] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3217] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3240] dhcp: init: Using DHCP client 'internal'
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3245] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3256] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3266] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3281] device (lo): Activation: starting connection 'lo' (7e49d025-cbad-4eaf-af6a-04deb1852570)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3294] device (eth0): carrier: link connected
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3301] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3311] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3312] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3322] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3336] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3348] device (eth1): carrier: link connected
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3355] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3364] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (42da23fc-7953-36b0-b86e-9cee7da143fd) (indicated)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3365] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3375] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3387] device (eth1): Activation: starting connection 'Wired connection 1' (42da23fc-7953-36b0-b86e-9cee7da143fd)
Jan 22 16:16:34 np0005592767 systemd[1]: Started Network Manager.
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3399] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3408] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3414] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3419] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3424] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3433] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3438] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3445] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3453] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3469] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3474] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3489] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3495] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3521] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3529] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 16:16:34 np0005592767 NetworkManager[7198]: <info>  [1769116594.3540] device (lo): Activation: successful, device activated.
Jan 22 16:16:34 np0005592767 systemd[1]: Starting Network Manager Wait Online...
Jan 22 16:16:34 np0005592767 python3[7256]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-9ef2-05fc-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0812] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0821] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0894] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0925] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0928] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0932] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0935] device (eth0): Activation: successful, device activated.
Jan 22 16:16:35 np0005592767 NetworkManager[7198]: <info>  [1769116595.0940] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 16:16:45 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:17:04 np0005592767 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.3659] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:17:19 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:17:19 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4021] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4024] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4030] device (eth1): Activation: successful, device activated.
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4037] manager: startup complete
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4040] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <warn>  [1769116639.4044] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4052] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 systemd[1]: Finished Network Manager Wait Online.
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4112] dhcp4 (eth1): canceled DHCP transaction
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4113] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4113] dhcp4 (eth1): state changed no lease
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4126] policy: auto-activating connection 'ci-private-network' (7e92cb46-b77b-515c-ae23-f1ff106f5b6e)
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4130] device (eth1): Activation: starting connection 'ci-private-network' (7e92cb46-b77b-515c-ae23-f1ff106f5b6e)
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4131] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4133] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4139] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4146] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4191] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4193] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:17:19 np0005592767 NetworkManager[7198]: <info>  [1769116639.4199] device (eth1): Activation: successful, device activated.
Jan 22 16:17:29 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:17:34 np0005592767 systemd[1]: session-3.scope: Deactivated successfully.
Jan 22 16:17:34 np0005592767 systemd[1]: session-3.scope: Consumed 1.549s CPU time.
Jan 22 16:17:34 np0005592767 systemd-logind[802]: Session 3 logged out. Waiting for processes to exit.
Jan 22 16:17:34 np0005592767 systemd-logind[802]: Removed session 3.
Jan 22 16:17:47 np0005592767 systemd-logind[802]: New session 4 of user zuul.
Jan 22 16:17:47 np0005592767 systemd[1]: Started Session 4 of User zuul.
Jan 22 16:17:48 np0005592767 python3[7387]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:17:48 np0005592767 python3[7460]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769116667.6946359-365-275467487384452/source _original_basename=tmphwaizw8d follow=False checksum=dd64a3b92ffda34adfd43ce03aa34a851854b0c5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:17:50 np0005592767 systemd[1]: session-4.scope: Deactivated successfully.
Jan 22 16:17:50 np0005592767 systemd-logind[802]: Session 4 logged out. Waiting for processes to exit.
Jan 22 16:17:50 np0005592767 systemd-logind[802]: Removed session 4.
Jan 22 16:18:57 np0005592767 systemd[4310]: Created slice User Background Tasks Slice.
Jan 22 16:18:57 np0005592767 systemd[4310]: Starting Cleanup of User's Temporary Files and Directories...
Jan 22 16:18:57 np0005592767 systemd[4310]: Finished Cleanup of User's Temporary Files and Directories.
Jan 22 16:23:50 np0005592767 systemd-logind[802]: New session 5 of user zuul.
Jan 22 16:23:50 np0005592767 systemd[1]: Started Session 5 of User zuul.
Jan 22 16:23:50 np0005592767 python3[7524]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-c9eb-29a0-000000000ca8-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:23:50 np0005592767 python3[7552]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:23:51 np0005592767 python3[7579]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:23:51 np0005592767 python3[7605]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:23:51 np0005592767 python3[7631]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:23:52 np0005592767 python3[7657]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:23:52 np0005592767 python3[7735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:23:53 np0005592767 python3[7808]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769117032.4641523-368-258302514300593/source _original_basename=tmpdagh0vr5 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:23:54 np0005592767 python3[7858]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:23:54 np0005592767 systemd[1]: Reloading.
Jan 22 16:23:54 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:23:56 np0005592767 python3[7914]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 22 16:24:01 np0005592767 python3[7940]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:24:01 np0005592767 python3[7968]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:24:02 np0005592767 python3[7996]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:24:02 np0005592767 python3[8024]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:24:03 np0005592767 python3[8051]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-c9eb-29a0-000000000caf-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:24:04 np0005592767 python3[8081]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 22 16:24:07 np0005592767 systemd[1]: session-5.scope: Deactivated successfully.
Jan 22 16:24:07 np0005592767 systemd[1]: session-5.scope: Consumed 4.172s CPU time.
Jan 22 16:24:07 np0005592767 systemd-logind[802]: Session 5 logged out. Waiting for processes to exit.
Jan 22 16:24:07 np0005592767 systemd-logind[802]: Removed session 5.
Jan 22 16:24:09 np0005592767 systemd-logind[802]: New session 6 of user zuul.
Jan 22 16:24:09 np0005592767 systemd[1]: Started Session 6 of User zuul.
Jan 22 16:24:09 np0005592767 python3[8116]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 22 16:24:23 np0005592767 setsebool[8159]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 22 16:24:23 np0005592767 setsebool[8159]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 22 16:24:38 np0005592767 kernel: SELinux:  Converting 385 SID table entries...
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:24:38 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:24:50 np0005592767 kernel: SELinux:  Converting 388 SID table entries...
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:24:50 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:25:10 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 16:25:10 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:25:10 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:25:10 np0005592767 systemd[1]: Reloading.
Jan 22 16:25:10 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:25:10 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:25:20 np0005592767 python3[15164]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f0c1-e77d-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:25:21 np0005592767 kernel: evm: overlay not supported
Jan 22 16:25:21 np0005592767 systemd[4310]: Starting D-Bus User Message Bus...
Jan 22 16:25:21 np0005592767 dbus-broker-launch[15691]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 22 16:25:21 np0005592767 dbus-broker-launch[15691]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 22 16:25:21 np0005592767 systemd[4310]: Started D-Bus User Message Bus.
Jan 22 16:25:21 np0005592767 dbus-broker-lau[15691]: Ready
Jan 22 16:25:21 np0005592767 systemd[4310]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 22 16:25:21 np0005592767 systemd[4310]: Created slice Slice /user.
Jan 22 16:25:21 np0005592767 systemd[4310]: podman-15617.scope: unit configures an IP firewall, but not running as root.
Jan 22 16:25:21 np0005592767 systemd[4310]: (This warning is only shown for the first unit using IP firewalling.)
Jan 22 16:25:21 np0005592767 systemd[4310]: Started podman-15617.scope.
Jan 22 16:25:21 np0005592767 systemd[4310]: Started podman-pause-8af279ce.scope.
Jan 22 16:25:22 np0005592767 python3[16096]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.58:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.58:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:25:22 np0005592767 python3[16096]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 22 16:25:23 np0005592767 systemd[1]: session-6.scope: Deactivated successfully.
Jan 22 16:25:23 np0005592767 systemd[1]: session-6.scope: Consumed 48.512s CPU time.
Jan 22 16:25:23 np0005592767 systemd-logind[802]: Session 6 logged out. Waiting for processes to exit.
Jan 22 16:25:23 np0005592767 systemd-logind[802]: Removed session 6.
Jan 22 16:25:48 np0005592767 systemd-logind[802]: New session 7 of user zuul.
Jan 22 16:25:48 np0005592767 systemd[1]: Started Session 7 of User zuul.
Jan 22 16:25:48 np0005592767 python3[25226]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP1bVa3UyQJYrIfR/D9+QGd9nQU79HczsFvkjJ9aX9AI5by0DEm0Wt09iGLM9Lsl9RzwuYfXi/K5FkpVq6cEErY= zuul@np0005592764.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:25:49 np0005592767 python3[25405]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP1bVa3UyQJYrIfR/D9+QGd9nQU79HczsFvkjJ9aX9AI5by0DEm0Wt09iGLM9Lsl9RzwuYfXi/K5FkpVq6cEErY= zuul@np0005592764.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:25:50 np0005592767 python3[25778]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005592767.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 22 16:25:50 np0005592767 irqbalance[797]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 22 16:25:50 np0005592767 irqbalance[797]: IRQ 27 affinity is now unmanaged
Jan 22 16:25:50 np0005592767 python3[25993]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP1bVa3UyQJYrIfR/D9+QGd9nQU79HczsFvkjJ9aX9AI5by0DEm0Wt09iGLM9Lsl9RzwuYfXi/K5FkpVq6cEErY= zuul@np0005592764.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 22 16:25:51 np0005592767 python3[26248]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:25:51 np0005592767 python3[26532]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769117150.844039-171-86656854017064/source _original_basename=tmpylujmf3j follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:25:52 np0005592767 python3[26871]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 22 16:25:52 np0005592767 systemd[1]: Starting Hostname Service...
Jan 22 16:25:52 np0005592767 systemd[1]: Started Hostname Service.
Jan 22 16:25:52 np0005592767 systemd-hostnamed[26983]: Changed pretty hostname to 'compute-2'
Jan 22 16:25:52 np0005592767 systemd-hostnamed[26983]: Hostname set to <compute-2> (static)
Jan 22 16:25:52 np0005592767 NetworkManager[7198]: <info>  [1769117152.6754] hostname: static hostname changed from "np0005592767.novalocal" to "compute-2"
Jan 22 16:25:52 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:25:52 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:25:53 np0005592767 systemd-logind[802]: Session 7 logged out. Waiting for processes to exit.
Jan 22 16:25:53 np0005592767 systemd[1]: session-7.scope: Deactivated successfully.
Jan 22 16:25:53 np0005592767 systemd[1]: session-7.scope: Consumed 2.353s CPU time.
Jan 22 16:25:53 np0005592767 systemd-logind[802]: Removed session 7.
Jan 22 16:26:00 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:26:00 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:26:00 np0005592767 systemd[1]: man-db-cache-update.service: Consumed 59.860s CPU time.
Jan 22 16:26:00 np0005592767 systemd[1]: run-rcd911edd18df42c18b6cd723f20e1674.service: Deactivated successfully.
Jan 22 16:26:02 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:26:22 np0005592767 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:28:47 np0005592767 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 22 16:28:47 np0005592767 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 22 16:28:47 np0005592767 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 22 16:28:47 np0005592767 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 22 16:30:26 np0005592767 systemd-logind[802]: New session 8 of user zuul.
Jan 22 16:30:26 np0005592767 systemd[1]: Started Session 8 of User zuul.
Jan 22 16:30:26 np0005592767 python3[30025]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:30:28 np0005592767 python3[30141]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:28 np0005592767 python3[30214]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:29 np0005592767 python3[30240]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:29 np0005592767 python3[30313]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:29 np0005592767 python3[30339]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:30 np0005592767 python3[30412]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:30 np0005592767 python3[30438]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:30 np0005592767 python3[30511]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:31 np0005592767 python3[30537]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:31 np0005592767 python3[30610]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:31 np0005592767 python3[30636]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:32 np0005592767 python3[30709]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:32 np0005592767 python3[30735]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 22 16:30:33 np0005592767 python3[30808]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769117428.1379926-34006-36844236896229/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:30:42 np0005592767 python3[30856]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:35:41 np0005592767 systemd[1]: session-8.scope: Deactivated successfully.
Jan 22 16:35:41 np0005592767 systemd[1]: session-8.scope: Consumed 5.536s CPU time.
Jan 22 16:35:41 np0005592767 systemd-logind[802]: Session 8 logged out. Waiting for processes to exit.
Jan 22 16:35:41 np0005592767 systemd-logind[802]: Removed session 8.
Jan 22 16:46:06 np0005592767 systemd-logind[802]: New session 9 of user zuul.
Jan 22 16:46:06 np0005592767 systemd[1]: Started Session 9 of User zuul.
Jan 22 16:46:07 np0005592767 python3.9[31040]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:46:08 np0005592767 python3.9[31221]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:25 np0005592767 systemd[1]: session-9.scope: Deactivated successfully.
Jan 22 16:46:25 np0005592767 systemd[1]: session-9.scope: Consumed 8.379s CPU time.
Jan 22 16:46:25 np0005592767 systemd-logind[802]: Session 9 logged out. Waiting for processes to exit.
Jan 22 16:46:25 np0005592767 systemd-logind[802]: Removed session 9.
Jan 22 16:46:41 np0005592767 systemd-logind[802]: New session 10 of user zuul.
Jan 22 16:46:41 np0005592767 systemd[1]: Started Session 10 of User zuul.
Jan 22 16:46:42 np0005592767 python3.9[31435]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 22 16:46:43 np0005592767 python3.9[31610]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:46:44 np0005592767 python3.9[31762]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:46:45 np0005592767 python3.9[31915]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:46:46 np0005592767 python3.9[32067]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:47 np0005592767 python3.9[32219]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:46:47 np0005592767 python3.9[32342]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118406.6853805-180-16732911965488/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:48 np0005592767 python3.9[32494]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:46:49 np0005592767 python3.9[32650]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:46:50 np0005592767 python3.9[32802]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:46:51 np0005592767 python3.9[32952]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:46:57 np0005592767 python3.9[33205]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:46:58 np0005592767 python3.9[33355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:46:59 np0005592767 python3.9[33509]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:47:00 np0005592767 python3.9[33667]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:47:01 np0005592767 python3.9[33751]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:48:38 np0005592767 systemd[1]: Reloading.
Jan 22 16:48:38 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:48:38 np0005592767 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 22 16:48:39 np0005592767 systemd[1]: Reloading.
Jan 22 16:48:39 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:48:39 np0005592767 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 22 16:48:39 np0005592767 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 22 16:48:39 np0005592767 systemd[1]: Reloading.
Jan 22 16:48:39 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:48:39 np0005592767 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 22 16:48:39 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 16:48:39 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 16:48:39 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 16:49:47 np0005592767 kernel: SELinux:  Converting 2723 SID table entries...
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:49:47 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:49:48 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 22 16:49:48 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:49:48 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:49:48 np0005592767 systemd[1]: Reloading.
Jan 22 16:49:48 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:49:48 np0005592767 systemd[1]: Starting dnf makecache...
Jan 22 16:49:48 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:49:48 np0005592767 dnf[34372]: Failed determining last makecache time.
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-openstack-barbican-42b4c41831408a8e323  90 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 147 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-openstack-cinder-1c00d6490d88e436f26ef 157 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-python-stevedore-c4acc5639fd2329372142 161 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-python-cloudkitty-tests-tempest-2c80f8 163 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-os-refresh-config-9bfc52b5049be2d8de61 155 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 150 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-python-designate-tests-tempest-347fdbc 164 kB/s | 3.0 kB     00:00
Jan 22 16:49:48 np0005592767 dnf[34372]: delorean-openstack-glance-1fd12c29b339f30fe823e 146 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 132 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-openstack-manila-3c01b7181572c95dac462 153 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-python-whitebox-neutron-tests-tempest- 156 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-openstack-octavia-ba397f07a7331190208c 158 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-openstack-watcher-c014f81a8647287f6dcc 171 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-ansible-config_template-5ccaa22121a7ff 169 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 163 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-openstack-swift-dc98a8463506ac520c469a 142 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-python-tempestconf-8515371b7cceebd4282 161 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: delorean-openstack-heat-ui-013accbfd179753bc3f0 156 kB/s | 3.0 kB     00:00
Jan 22 16:49:49 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:49:49 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:49:49 np0005592767 systemd[1]: man-db-cache-update.service: Consumed 1.035s CPU time.
Jan 22 16:49:49 np0005592767 systemd[1]: run-r7cfdce8d6dca4e91be76387ef85fa75e.service: Deactivated successfully.
Jan 22 16:49:49 np0005592767 dnf[34372]: CentOS Stream 9 - BaseOS                         53 kB/s | 6.7 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: CentOS Stream 9 - AppStream                      61 kB/s | 6.8 kB     00:00
Jan 22 16:49:49 np0005592767 python3.9[35298]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:49:49 np0005592767 dnf[34372]: CentOS Stream 9 - CRB                            28 kB/s | 6.6 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: CentOS Stream 9 - Extras packages                71 kB/s | 7.3 kB     00:00
Jan 22 16:49:49 np0005592767 dnf[34372]: dlrn-antelope-testing                           159 kB/s | 3.0 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: dlrn-antelope-build-deps                        155 kB/s | 3.0 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: centos9-rabbitmq                                 33 kB/s | 3.0 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: centos9-storage                                 130 kB/s | 3.0 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: centos9-opstools                                 35 kB/s | 3.0 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: NFV SIG OpenvSwitch                              71 kB/s | 3.0 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: repo-setup-centos-appstream                     110 kB/s | 4.4 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: repo-setup-centos-baseos                        100 kB/s | 3.9 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: repo-setup-centos-highavailability              100 kB/s | 3.9 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: repo-setup-centos-powertools                    105 kB/s | 4.3 kB     00:00
Jan 22 16:49:50 np0005592767 dnf[34372]: Extra Packages for Enterprise Linux 9 - x86_64  195 kB/s |  28 kB     00:00
Jan 22 16:49:51 np0005592767 dnf[34372]: Metadata cache created.
Jan 22 16:49:51 np0005592767 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 16:49:51 np0005592767 systemd[1]: Finished dnf makecache.
Jan 22 16:49:51 np0005592767 systemd[1]: dnf-makecache.service: Consumed 1.721s CPU time.
Jan 22 16:49:51 np0005592767 python3.9[35600]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 22 16:49:52 np0005592767 python3.9[35752]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 22 16:49:54 np0005592767 python3.9[35906]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:49:56 np0005592767 python3.9[36058]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 22 16:49:58 np0005592767 python3.9[36210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:49:58 np0005592767 python3.9[36362]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:01 np0005592767 python3.9[36485]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118598.5533082-669-137647133801368/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:07 np0005592767 python3.9[36637]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:08 np0005592767 python3.9[36789]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:50:09 np0005592767 python3.9[36942]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:50:10 np0005592767 python3.9[37094]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 22 16:50:10 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:50:10 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:50:11 np0005592767 python3.9[37248]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:50:13 np0005592767 python3.9[37406]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:50:14 np0005592767 python3.9[37566]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 22 16:50:15 np0005592767 python3.9[37719]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:50:16 np0005592767 python3.9[37877]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 22 16:50:17 np0005592767 python3.9[38029]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:50:20 np0005592767 python3.9[38182]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:21 np0005592767 python3.9[38334]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:22 np0005592767 python3.9[38457]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118621.029875-1026-10856754089832/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:23 np0005592767 python3.9[38609]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:50:23 np0005592767 systemd[1]: Starting Load Kernel Modules...
Jan 22 16:50:23 np0005592767 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 22 16:50:23 np0005592767 kernel: Bridge firewalling registered
Jan 22 16:50:23 np0005592767 systemd-modules-load[38613]: Inserted module 'br_netfilter'
Jan 22 16:50:23 np0005592767 systemd[1]: Finished Load Kernel Modules.
Jan 22 16:50:24 np0005592767 python3.9[38768]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:50:24 np0005592767 python3.9[38891]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118623.6914635-1095-212486031123155/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:50:25 np0005592767 python3.9[39043]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:50:30 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 16:50:30 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 16:50:30 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:50:30 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:50:30 np0005592767 systemd[1]: Reloading.
Jan 22 16:50:30 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:30 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:50:34 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:50:34 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:50:34 np0005592767 systemd[1]: man-db-cache-update.service: Consumed 4.256s CPU time.
Jan 22 16:50:34 np0005592767 systemd[1]: run-rebdccab73bbf45909a76752933301c1c.service: Deactivated successfully.
Jan 22 16:50:34 np0005592767 python3.9[42757]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:35 np0005592767 python3.9[42909]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 22 16:50:36 np0005592767 python3.9[43059]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:50:37 np0005592767 python3.9[43211]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:50:37 np0005592767 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 16:50:37 np0005592767 systemd[1]: Starting Authorization Manager...
Jan 22 16:50:37 np0005592767 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 16:50:37 np0005592767 polkitd[43428]: Started polkitd version 0.117
Jan 22 16:50:37 np0005592767 systemd[1]: Started Authorization Manager.
Jan 22 16:50:39 np0005592767 python3.9[43598]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:50:39 np0005592767 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 22 16:50:39 np0005592767 systemd[1]: tuned.service: Deactivated successfully.
Jan 22 16:50:39 np0005592767 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 22 16:50:39 np0005592767 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 22 16:50:39 np0005592767 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 22 16:50:40 np0005592767 python3.9[43760]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 22 16:50:40 np0005592767 irqbalance[797]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 22 16:50:40 np0005592767 irqbalance[797]: IRQ 26 affinity is now unmanaged
Jan 22 16:50:43 np0005592767 python3.9[43912]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:50:43 np0005592767 systemd[1]: Reloading.
Jan 22 16:50:43 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:44 np0005592767 python3.9[44100]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:50:44 np0005592767 systemd[1]: Reloading.
Jan 22 16:50:44 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:50:46 np0005592767 python3.9[44288]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:50:46 np0005592767 python3.9[44441]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:50:46 np0005592767 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 22 16:50:47 np0005592767 python3.9[44594]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:50:49 np0005592767 python3.9[44756]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:50:50 np0005592767 python3.9[44909]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:50:50 np0005592767 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 22 16:50:50 np0005592767 systemd[1]: Stopped Apply Kernel Variables.
Jan 22 16:50:50 np0005592767 systemd[1]: Stopping Apply Kernel Variables...
Jan 22 16:50:50 np0005592767 systemd[1]: Starting Apply Kernel Variables...
Jan 22 16:50:50 np0005592767 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 22 16:50:50 np0005592767 systemd[1]: Finished Apply Kernel Variables.
Jan 22 16:50:52 np0005592767 systemd[1]: session-10.scope: Deactivated successfully.
Jan 22 16:50:52 np0005592767 systemd[1]: session-10.scope: Consumed 2min 13.915s CPU time.
Jan 22 16:50:52 np0005592767 systemd-logind[802]: Session 10 logged out. Waiting for processes to exit.
Jan 22 16:50:52 np0005592767 systemd-logind[802]: Removed session 10.
Jan 22 16:50:58 np0005592767 systemd-logind[802]: New session 11 of user zuul.
Jan 22 16:50:58 np0005592767 systemd[1]: Started Session 11 of User zuul.
Jan 22 16:50:59 np0005592767 python3.9[45092]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:51:00 np0005592767 python3.9[45246]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:51:01 np0005592767 python3.9[45402]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:51:03 np0005592767 python3.9[45553]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:51:04 np0005592767 python3.9[45709]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:51:04 np0005592767 python3.9[45795]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:51:06 np0005592767 python3.9[45948]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:51:10 np0005592767 python3.9[46119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:10 np0005592767 python3.9[46271]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:51:10 np0005592767 systemd[1]: var-lib-containers-storage-overlay-compat1599318948-merged.mount: Deactivated successfully.
Jan 22 16:51:10 np0005592767 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1959939743-merged.mount: Deactivated successfully.
Jan 22 16:51:10 np0005592767 podman[46272]: 2026-01-22 21:51:10.908751542 +0000 UTC m=+0.048855337 system refresh
Jan 22 16:51:11 np0005592767 python3.9[46435]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:11 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:51:14 np0005592767 python3.9[46558]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118671.2062855-289-129210784786619/.source.json follow=False _original_basename=podman_network_config.j2 checksum=f9631971c8eec37a28a41477d33ddcb545233057 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:51:15 np0005592767 python3.9[46710]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:51:16 np0005592767 python3.9[46833]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118675.2185013-334-50312416459814/.source.conf follow=False _original_basename=registries.conf.j2 checksum=3d06d4b51e1ab18af024121ef8fec31b3fc3dc21 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:51:17 np0005592767 python3.9[46985]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:51:17 np0005592767 python3.9[47137]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:51:18 np0005592767 python3.9[47289]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:51:18 np0005592767 python3.9[47441]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:51:20 np0005592767 python3.9[47591]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:51:20 np0005592767 python3.9[47745]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:22 np0005592767 python3.9[47898]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:26 np0005592767 python3.9[48058]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:28 np0005592767 python3.9[48211]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:30 np0005592767 python3.9[48364]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:33 np0005592767 python3.9[48520]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:38 np0005592767 python3.9[48690]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:51:40 np0005592767 python3.9[48843]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:52:36 np0005592767 python3.9[49179]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:52:39 np0005592767 python3.9[49335]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:52:42 np0005592767 python3.9[49492]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:52:43 np0005592767 python3.9[49667]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:52:44 np0005592767 python3.9[49790]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769118763.1237643-808-228606269379783/.source.json _original_basename=.b8tn7q2_ follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:52:45 np0005592767 python3.9[49942]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:52:45 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:52:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay-compat3259275697-lower\x2dmapped.mount: Deactivated successfully.
Jan 22 16:52:57 np0005592767 podman[49954]: 2026-01-22 21:52:57.28680601 +0000 UTC m=+11.658748694 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 16:52:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:52:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:52:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:00 np0005592767 python3.9[50248]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:53:00 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:35 np0005592767 podman[50260]: 2026-01-22 21:53:35.267586415 +0000 UTC m=+34.887516141 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 16:53:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:36 np0005592767 python3.9[50542]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:53:36 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:39 np0005592767 podman[50554]: 2026-01-22 21:53:39.700647062 +0000 UTC m=+3.106860452 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 22 16:53:39 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:39 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:39 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:40 np0005592767 python3.9[50809]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 22 16:53:42 np0005592767 podman[50821]: 2026-01-22 21:53:42.477639097 +0000 UTC m=+1.969003972 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 22 16:53:42 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:42 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:42 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:53:45 np0005592767 systemd-logind[802]: Session 11 logged out. Waiting for processes to exit.
Jan 22 16:53:45 np0005592767 systemd[1]: session-11.scope: Deactivated successfully.
Jan 22 16:53:45 np0005592767 systemd[1]: session-11.scope: Consumed 1min 51.431s CPU time.
Jan 22 16:53:45 np0005592767 systemd-logind[802]: Removed session 11.
Jan 22 16:53:51 np0005592767 systemd-logind[802]: New session 12 of user zuul.
Jan 22 16:53:51 np0005592767 systemd[1]: Started Session 12 of User zuul.
Jan 22 16:53:52 np0005592767 python3.9[51120]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:53:53 np0005592767 python3.9[51276]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 22 16:53:54 np0005592767 python3.9[51429]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 16:53:55 np0005592767 python3.9[51587]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 16:53:56 np0005592767 python3.9[51747]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:53:57 np0005592767 python3.9[51831]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:54:00 np0005592767 python3.9[51993]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:54:17 np0005592767 kernel: SELinux:  Converting 2737 SID table entries...
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:54:17 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:54:19 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 22 16:54:19 np0005592767 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 22 16:54:20 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:54:20 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:54:20 np0005592767 systemd[1]: Reloading.
Jan 22 16:54:20 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:54:20 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:54:20 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:54:21 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:54:21 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:54:21 np0005592767 systemd[1]: run-r8d0ff344770f4dc480474c6d007e0b87.service: Deactivated successfully.
Jan 22 16:54:23 np0005592767 python3.9[53093]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:54:23 np0005592767 systemd[1]: Reloading.
Jan 22 16:54:23 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:54:23 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:54:23 np0005592767 systemd[1]: Starting Open vSwitch Database Unit...
Jan 22 16:54:23 np0005592767 chown[53135]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 22 16:54:23 np0005592767 ovs-ctl[53140]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 22 16:54:23 np0005592767 ovs-ctl[53140]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 22 16:54:23 np0005592767 ovs-ctl[53140]: Starting ovsdb-server [  OK  ]
Jan 22 16:54:23 np0005592767 ovs-vsctl[53189]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 22 16:54:23 np0005592767 ovs-vsctl[53205]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"e130c2ec-fef7-4ed2-892d-1e3d7eaab401\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 22 16:54:23 np0005592767 ovs-ctl[53140]: Configuring Open vSwitch system IDs [  OK  ]
Jan 22 16:54:23 np0005592767 ovs-vsctl[53215]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 22 16:54:23 np0005592767 ovs-ctl[53140]: Enabling remote OVSDB managers [  OK  ]
Jan 22 16:54:23 np0005592767 systemd[1]: Started Open vSwitch Database Unit.
Jan 22 16:54:23 np0005592767 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 22 16:54:23 np0005592767 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 22 16:54:23 np0005592767 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 22 16:54:24 np0005592767 kernel: openvswitch: Open vSwitch switching datapath
Jan 22 16:54:24 np0005592767 ovs-ctl[53260]: Inserting openvswitch module [  OK  ]
Jan 22 16:54:24 np0005592767 ovs-ctl[53229]: Starting ovs-vswitchd [  OK  ]
Jan 22 16:54:24 np0005592767 ovs-vsctl[53277]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 22 16:54:24 np0005592767 ovs-ctl[53229]: Enabling remote OVSDB managers [  OK  ]
Jan 22 16:54:24 np0005592767 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 22 16:54:24 np0005592767 systemd[1]: Starting Open vSwitch...
Jan 22 16:54:24 np0005592767 systemd[1]: Finished Open vSwitch.
Jan 22 16:54:25 np0005592767 python3.9[53429]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:54:26 np0005592767 python3.9[53581]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 22 16:54:27 np0005592767 kernel: SELinux:  Converting 2751 SID table entries...
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 16:54:27 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 16:54:29 np0005592767 python3.9[53736]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:54:29 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 22 16:54:30 np0005592767 python3.9[53894]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:54:32 np0005592767 python3.9[54047]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:54:34 np0005592767 python3.9[54334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 22 16:54:35 np0005592767 python3.9[54484]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:54:35 np0005592767 python3.9[54638]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:54:37 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:54:37 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:54:37 np0005592767 systemd[1]: Reloading.
Jan 22 16:54:37 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:54:37 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:54:37 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:54:38 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:54:38 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:54:38 np0005592767 systemd[1]: run-rdab2c92dfdee4755b0fc3004f278f145.service: Deactivated successfully.
Jan 22 16:54:39 np0005592767 python3.9[54955]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:54:39 np0005592767 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 22 16:54:39 np0005592767 systemd[1]: Stopped Network Manager Wait Online.
Jan 22 16:54:39 np0005592767 systemd[1]: Stopping Network Manager Wait Online...
Jan 22 16:54:39 np0005592767 systemd[1]: Stopping Network Manager...
Jan 22 16:54:39 np0005592767 NetworkManager[7198]: <info>  [1769118879.0660] caught SIGTERM, shutting down normally.
Jan 22 16:54:39 np0005592767 NetworkManager[7198]: <info>  [1769118879.0676] dhcp4 (eth0): canceled DHCP transaction
Jan 22 16:54:39 np0005592767 NetworkManager[7198]: <info>  [1769118879.0676] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:54:39 np0005592767 NetworkManager[7198]: <info>  [1769118879.0676] dhcp4 (eth0): state changed no lease
Jan 22 16:54:39 np0005592767 NetworkManager[7198]: <info>  [1769118879.0679] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:54:39 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:54:39 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:54:39 np0005592767 NetworkManager[7198]: <info>  [1769118879.4126] exiting (success)
Jan 22 16:54:39 np0005592767 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 22 16:54:39 np0005592767 systemd[1]: Stopped Network Manager.
Jan 22 16:54:39 np0005592767 systemd[1]: NetworkManager.service: Consumed 16.311s CPU time, 4.0M memory peak, read 0B from disk, written 36.5K to disk.
Jan 22 16:54:39 np0005592767 systemd[1]: Starting Network Manager...
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.5065] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:339c1445-6b44-44ff-b543-d72e4d6762b9)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.5066] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.5132] manager[0x55d1ba212000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 22 16:54:39 np0005592767 systemd[1]: Starting Hostname Service...
Jan 22 16:54:39 np0005592767 systemd[1]: Started Hostname Service.
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6033] hostname: hostname: using hostnamed
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6034] hostname: static hostname changed from (none) to "compute-2"
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6038] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6044] manager[0x55d1ba212000]: rfkill: Wi-Fi hardware radio set enabled
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6044] manager[0x55d1ba212000]: rfkill: WWAN hardware radio set enabled
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6068] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6078] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6079] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6080] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6080] manager: Networking is enabled by state file
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6083] settings: Loaded settings plugin: keyfile (internal)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6086] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6115] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6124] dhcp: init: Using DHCP client 'internal'
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6126] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6131] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6137] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6145] device (lo): Activation: starting connection 'lo' (7e49d025-cbad-4eaf-af6a-04deb1852570)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6153] device (eth0): carrier: link connected
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6157] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6163] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6163] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6169] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6175] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6182] device (eth1): carrier: link connected
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6187] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6193] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (7e92cb46-b77b-515c-ae23-f1ff106f5b6e) (indicated)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6194] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6199] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6206] device (eth1): Activation: starting connection 'ci-private-network' (7e92cb46-b77b-515c-ae23-f1ff106f5b6e)
Jan 22 16:54:39 np0005592767 systemd[1]: Started Network Manager.
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6212] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6234] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6237] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6239] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6241] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6244] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6247] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6249] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6253] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6263] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6267] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6279] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6298] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6310] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6313] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6319] device (lo): Activation: successful, device activated.
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6328] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6335] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 22 16:54:39 np0005592767 systemd[1]: Starting Network Manager Wait Online...
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6419] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6430] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6436] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6440] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6443] device (eth1): Activation: successful, device activated.
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6456] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6458] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6461] manager: NetworkManager state is now CONNECTED_SITE
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6465] device (eth0): Activation: successful, device activated.
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6470] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 22 16:54:39 np0005592767 NetworkManager[54973]: <info>  [1769118879.6473] manager: startup complete
Jan 22 16:54:39 np0005592767 systemd[1]: Finished Network Manager Wait Online.
Jan 22 16:54:40 np0005592767 python3.9[55181]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:54:46 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 16:54:46 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 16:54:46 np0005592767 systemd[1]: Reloading.
Jan 22 16:54:46 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:54:46 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:54:46 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 16:54:48 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 16:54:48 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 16:54:48 np0005592767 systemd[1]: run-r38586cab6e20495889d11b5b7234c9e9.service: Deactivated successfully.
Jan 22 16:54:49 np0005592767 python3.9[55640]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:54:49 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:54:50 np0005592767 python3.9[55792]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:50 np0005592767 python3.9[55946]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:51 np0005592767 python3.9[56098]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:52 np0005592767 python3.9[56250]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:52 np0005592767 python3.9[56402]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:53 np0005592767 python3.9[56554]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:54:54 np0005592767 python3.9[56677]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118893.3663278-649-161576152748148/.source _original_basename=.nuo271ut follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:55 np0005592767 python3.9[56829]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:55 np0005592767 python3.9[56981]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 22 16:54:56 np0005592767 python3.9[57133]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:54:59 np0005592767 python3.9[57560]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 22 16:55:00 np0005592767 ansible-async_wrapper.py[57735]: Invoked with j636444412223 300 /home/zuul/.ansible/tmp/ansible-tmp-1769118899.4140472-847-97416222808781/AnsiballZ_edpm_os_net_config.py _
Jan 22 16:55:00 np0005592767 ansible-async_wrapper.py[57738]: Starting module and watcher
Jan 22 16:55:00 np0005592767 ansible-async_wrapper.py[57738]: Start watching 57739 (300)
Jan 22 16:55:00 np0005592767 ansible-async_wrapper.py[57739]: Start module (57739)
Jan 22 16:55:00 np0005592767 ansible-async_wrapper.py[57735]: Return async_wrapper task started.
Jan 22 16:55:00 np0005592767 python3.9[57740]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 22 16:55:01 np0005592767 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 22 16:55:01 np0005592767 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 22 16:55:01 np0005592767 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 22 16:55:01 np0005592767 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 22 16:55:01 np0005592767 kernel: cfg80211: failed to load regulatory.db
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9159] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9180] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9681] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9685] audit: op="connection-add" uuid="da4cd46d-b69a-4802-a375-e3ec205c55ca" name="br-ex-br" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9699] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9700] audit: op="connection-add" uuid="a4c0338c-6879-40d2-a197-0a29e2019045" name="br-ex-port" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9710] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9712] audit: op="connection-add" uuid="04de40ce-ef45-45ae-831e-ab1206367625" name="eth1-port" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9721] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9723] audit: op="connection-add" uuid="c63e620a-1bfb-490f-80e7-df5ee0d88033" name="vlan20-port" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9732] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9733] audit: op="connection-add" uuid="443d0d2b-88ef-41b4-91c6-78520b59f4c1" name="vlan21-port" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9742] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9743] audit: op="connection-add" uuid="3340d1eb-8d50-4b4d-ac40-1ad7a1bd2483" name="vlan22-port" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9760] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=57741 uid=0 result="success"
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9773] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 22 16:55:01 np0005592767 NetworkManager[54973]: <info>  [1769118901.9774] audit: op="connection-add" uuid="23c4dadb-9bfd-4589-a8f5-8737d22fc307" name="br-ex-if" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2723] audit: op="connection-update" uuid="7e92cb46-b77b-515c-ae23-f1ff106f5b6e" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv4.never-default,ipv4.routes,ipv4.addresses,ipv4.routing-rules,ipv4.dns,ipv4.method,connection.controller,connection.slave-type,connection.master,connection.port-type,connection.timestamp,ipv6.method,ipv6.routes,ipv6.addresses,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.dns" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2762] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2768] audit: op="connection-add" uuid="9e050a30-4c8b-4b6e-9720-e8bc68c830c8" name="vlan20-if" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2801] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2806] audit: op="connection-add" uuid="baa12737-3f6f-428e-8da9-98d31cadad2e" name="vlan21-if" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2836] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2840] audit: op="connection-add" uuid="c96eb2a5-e3db-4d5b-9c4b-c3e6f38974d4" name="vlan22-if" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2861] audit: op="connection-delete" uuid="42da23fc-7953-36b0-b86e-9cee7da143fd" name="Wired connection 1" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2884] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.2890] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2904] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2913] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (da4cd46d-b69a-4802-a375-e3ec205c55ca)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2916] audit: op="connection-activate" uuid="da4cd46d-b69a-4802-a375-e3ec205c55ca" name="br-ex-br" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2923] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.2927] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2939] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2948] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (a4c0338c-6879-40d2-a197-0a29e2019045)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2953] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.2956] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2966] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2976] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (04de40ce-ef45-45ae-831e-ab1206367625)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.2980] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.2988] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3003] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3012] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (c63e620a-1bfb-490f-80e7-df5ee0d88033)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3016] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.3018] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3028] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3036] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (443d0d2b-88ef-41b4-91c6-78520b59f4c1)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3039] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.3041] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3051] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3058] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (3340d1eb-8d50-4b4d-ac40-1ad7a1bd2483)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3060] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3066] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3070] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3086] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.3089] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3095] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3105] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (23c4dadb-9bfd-4589-a8f5-8737d22fc307)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3107] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3113] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3118] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3121] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3123] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3148] device (eth1): disconnecting for new activation request.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3150] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3157] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3162] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3164] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3168] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.3170] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3178] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3189] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (9e050a30-4c8b-4b6e-9720-e8bc68c830c8)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3191] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3199] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3204] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3206] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3210] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.3212] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3218] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3226] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (baa12737-3f6f-428e-8da9-98d31cadad2e)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3227] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3234] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3239] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3241] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3246] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <warn>  [1769118902.3248] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3256] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3264] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (c96eb2a5-e3db-4d5b-9c4b-c3e6f38974d4)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3266] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3273] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3277] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3280] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3284] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3308] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3313] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3319] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3324] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3342] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 kernel: ovs-system: entered promiscuous mode
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3385] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3390] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3395] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3398] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3405] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3410] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 systemd-udevd[57747]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:55:02 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:55:02 np0005592767 kernel: Timeout policy base is empty
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3412] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3414] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3421] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3426] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3430] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3432] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3436] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3440] dhcp4 (eth0): canceled DHCP transaction
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3440] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3440] dhcp4 (eth0): state changed no lease
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3442] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3455] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.3459] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57741 uid=0 result="fail" reason="Device is not activated"
Jan 22 16:55:02 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:55:02 np0005592767 kernel: br-ex: entered promiscuous mode
Jan 22 16:55:02 np0005592767 kernel: vlan20: entered promiscuous mode
Jan 22 16:55:02 np0005592767 systemd-udevd[57746]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4236] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4240] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4249] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4257] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4265] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:55:02 np0005592767 kernel: vlan21: entered promiscuous mode
Jan 22 16:55:02 np0005592767 kernel: vlan22: entered promiscuous mode
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4458] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4548] device (eth1): Activation: starting connection 'ci-private-network' (7e92cb46-b77b-515c-ae23-f1ff106f5b6e)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4554] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4556] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4558] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4559] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4561] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4564] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4570] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4590] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4600] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4604] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4609] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4616] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4621] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4628] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4633] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4639] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4645] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4650] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4655] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4660] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4665] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4669] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4675] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4681] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4684] device (eth1): released from controller device eth1
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4693] device (eth1): disconnecting for new activation request.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4694] audit: op="connection-activate" uuid="7e92cb46-b77b-515c-ae23-f1ff106f5b6e" name="ci-private-network" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4711] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4720] device (eth1): Activation: starting connection 'ci-private-network' (7e92cb46-b77b-515c-ae23-f1ff106f5b6e)
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4732] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4753] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4759] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4764] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57741 uid=0 result="success"
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4769] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4776] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4787] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4796] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4801] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4804] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4821] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4827] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4834] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4843] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4845] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4850] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4855] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4860] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4865] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4871] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4874] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4879] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4884] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4886] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.4890] device (eth1): Activation: successful, device activated.
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.7509] checkpoint[0x55d1ba1e8950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 22 16:55:02 np0005592767 NetworkManager[54973]: <info>  [1769118902.7512] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.0085] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.0106] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.1770] audit: op="networking-control" arg="global-dns-configuration" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.1796] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.1826] audit: op="networking-control" arg="global-dns-configuration" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.1843] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.2992] checkpoint[0x55d1ba1e8a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 22 16:55:03 np0005592767 NetworkManager[54973]: <info>  [1769118903.2995] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57741 uid=0 result="success"
Jan 22 16:55:03 np0005592767 ansible-async_wrapper.py[57739]: Module complete (57739)
Jan 22 16:55:04 np0005592767 python3.9[58084]: ansible-ansible.legacy.async_status Invoked with jid=j636444412223.57735 mode=status _async_dir=/root/.ansible_async
Jan 22 16:55:04 np0005592767 python3.9[58184]: ansible-ansible.legacy.async_status Invoked with jid=j636444412223.57735 mode=cleanup _async_dir=/root/.ansible_async
Jan 22 16:55:05 np0005592767 ansible-async_wrapper.py[57738]: Done in kid B.
Jan 22 16:55:08 np0005592767 python3.9[58337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:55:09 np0005592767 python3.9[58460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118908.4544282-923-39073487149977/.source.returncode _original_basename=.dit_vfj4 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:55:09 np0005592767 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 16:55:12 np0005592767 python3.9[58615]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:55:12 np0005592767 python3.9[58738]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118911.9223628-971-275328838855666/.source.cfg _original_basename=.ldb4m29l follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:55:13 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:55:13 np0005592767 python3.9[58890]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:55:13 np0005592767 systemd[1]: Reloading Network Manager...
Jan 22 16:55:13 np0005592767 NetworkManager[54973]: <info>  [1769118913.9482] audit: op="reload" arg="0" pid=58895 uid=0 result="success"
Jan 22 16:55:13 np0005592767 NetworkManager[54973]: <info>  [1769118913.9492] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 22 16:55:13 np0005592767 systemd[1]: Reloaded Network Manager.
Jan 22 16:55:13 np0005592767 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 22 16:55:14 np0005592767 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 22 16:55:14 np0005592767 systemd[1]: session-12.scope: Deactivated successfully.
Jan 22 16:55:14 np0005592767 systemd[1]: session-12.scope: Consumed 48.719s CPU time.
Jan 22 16:55:14 np0005592767 systemd-logind[802]: Session 12 logged out. Waiting for processes to exit.
Jan 22 16:55:14 np0005592767 systemd-logind[802]: Removed session 12.
Jan 22 16:55:19 np0005592767 systemd-logind[802]: New session 13 of user zuul.
Jan 22 16:55:19 np0005592767 systemd[1]: Started Session 13 of User zuul.
Jan 22 16:55:20 np0005592767 python3.9[59083]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:55:21 np0005592767 python3.9[59237]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:55:23 np0005592767 python3.9[59426]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:55:23 np0005592767 systemd[1]: session-13.scope: Deactivated successfully.
Jan 22 16:55:23 np0005592767 systemd[1]: session-13.scope: Consumed 2.154s CPU time.
Jan 22 16:55:23 np0005592767 systemd-logind[802]: Session 13 logged out. Waiting for processes to exit.
Jan 22 16:55:23 np0005592767 systemd-logind[802]: Removed session 13.
Jan 22 16:55:24 np0005592767 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 22 16:55:29 np0005592767 systemd-logind[802]: New session 14 of user zuul.
Jan 22 16:55:29 np0005592767 systemd[1]: Started Session 14 of User zuul.
Jan 22 16:55:30 np0005592767 python3.9[59610]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:55:31 np0005592767 python3.9[59764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:55:32 np0005592767 python3.9[59920]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:55:33 np0005592767 python3.9[60005]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:55:35 np0005592767 python3.9[60158]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:55:36 np0005592767 python3.9[60349]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:55:37 np0005592767 python3.9[60503]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:55:37 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:55:38 np0005592767 python3.9[60666]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:55:39 np0005592767 python3.9[60744]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:55:40 np0005592767 python3.9[60896]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:55:40 np0005592767 python3.9[60974]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:55:41 np0005592767 python3.9[61126]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:55:42 np0005592767 python3.9[61278]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:55:42 np0005592767 python3.9[61430]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:55:43 np0005592767 python3.9[61582]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:55:44 np0005592767 python3.9[61736]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:55:47 np0005592767 python3.9[61889]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:55:47 np0005592767 python3.9[62043]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:55:48 np0005592767 python3.9[62195]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:55:49 np0005592767 python3.9[62347]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:55:50 np0005592767 python3.9[62500]: ansible-service_facts Invoked
Jan 22 16:55:50 np0005592767 network[62517]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:55:50 np0005592767 network[62518]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:55:50 np0005592767 network[62519]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:55:57 np0005592767 python3.9[62971]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:55:59 np0005592767 python3.9[63124]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 22 16:56:01 np0005592767 python3.9[63276]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:01 np0005592767 python3.9[63401]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118960.8307145-659-117151506518607/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:02 np0005592767 python3.9[63555]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:03 np0005592767 python3.9[63680]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118962.3005314-705-218211229229842/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:06 np0005592767 python3.9[63834]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:07 np0005592767 python3.9[63988]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:56:08 np0005592767 python3.9[64072]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:56:11 np0005592767 python3.9[64226]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:56:12 np0005592767 python3.9[64310]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:56:12 np0005592767 chronyd[789]: chronyd exiting
Jan 22 16:56:12 np0005592767 systemd[1]: Stopping NTP client/server...
Jan 22 16:56:12 np0005592767 systemd[1]: chronyd.service: Deactivated successfully.
Jan 22 16:56:12 np0005592767 systemd[1]: Stopped NTP client/server.
Jan 22 16:56:12 np0005592767 systemd[1]: Starting NTP client/server...
Jan 22 16:56:12 np0005592767 chronyd[64318]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 22 16:56:12 np0005592767 chronyd[64318]: Frequency -28.352 +/- 0.303 ppm read from /var/lib/chrony/drift
Jan 22 16:56:12 np0005592767 chronyd[64318]: Loaded seccomp filter (level 2)
Jan 22 16:56:12 np0005592767 systemd[1]: Started NTP client/server.
Jan 22 16:56:12 np0005592767 systemd[1]: session-14.scope: Deactivated successfully.
Jan 22 16:56:12 np0005592767 systemd[1]: session-14.scope: Consumed 24.625s CPU time.
Jan 22 16:56:12 np0005592767 systemd-logind[802]: Session 14 logged out. Waiting for processes to exit.
Jan 22 16:56:12 np0005592767 systemd-logind[802]: Removed session 14.
Jan 22 16:56:18 np0005592767 systemd-logind[802]: New session 15 of user zuul.
Jan 22 16:56:18 np0005592767 systemd[1]: Started Session 15 of User zuul.
Jan 22 16:56:19 np0005592767 python3.9[64497]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:56:20 np0005592767 python3.9[64653]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:21 np0005592767 python3.9[64828]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:22 np0005592767 python3.9[64906]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.msfqbnar recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:23 np0005592767 python3.9[65058]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:23 np0005592767 python3.9[65181]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769118982.8506281-145-178687610520537/.source _original_basename=.e1yypq1l follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:24 np0005592767 python3.9[65333]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:56:25 np0005592767 python3.9[65485]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:25 np0005592767 python3.9[65608]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118984.9363565-217-51348116998450/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:56:26 np0005592767 python3.9[65760]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:26 np0005592767 python3.9[65883]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769118985.9748707-217-75961757901329/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:56:28 np0005592767 python3.9[66035]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:28 np0005592767 python3.9[66187]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:29 np0005592767 python3.9[66310]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118988.3754468-328-37408918682279/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:30 np0005592767 python3.9[66462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:30 np0005592767 python3.9[66585]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118989.731714-373-85929006641235/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:31 np0005592767 python3.9[66737]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:56:31 np0005592767 systemd[1]: Reloading.
Jan 22 16:56:32 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:56:32 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:56:32 np0005592767 systemd[1]: Reloading.
Jan 22 16:56:32 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:56:32 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:56:32 np0005592767 systemd[1]: Starting EDPM Container Shutdown...
Jan 22 16:56:32 np0005592767 systemd[1]: Finished EDPM Container Shutdown.
Jan 22 16:56:33 np0005592767 python3.9[66966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:33 np0005592767 python3.9[67089]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118992.825025-443-163905078763474/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:34 np0005592767 python3.9[67241]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:35 np0005592767 python3.9[67364]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769118994.2203124-488-259944997682616/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:36 np0005592767 python3.9[67516]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:56:36 np0005592767 systemd[1]: Reloading.
Jan 22 16:56:36 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:56:36 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:56:36 np0005592767 systemd[1]: Reloading.
Jan 22 16:56:36 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:56:36 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:56:36 np0005592767 systemd[1]: Starting Create netns directory...
Jan 22 16:56:36 np0005592767 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 16:56:36 np0005592767 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 16:56:36 np0005592767 systemd[1]: Finished Create netns directory.
Jan 22 16:56:37 np0005592767 python3.9[67742]: ansible-ansible.builtin.service_facts Invoked
Jan 22 16:56:37 np0005592767 network[67759]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 16:56:37 np0005592767 network[67760]: 'network-scripts' will be removed from distribution in near future.
Jan 22 16:56:37 np0005592767 network[67761]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 16:56:41 np0005592767 python3.9[68023]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:56:42 np0005592767 systemd[1]: Reloading.
Jan 22 16:56:42 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:56:42 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:56:42 np0005592767 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 22 16:56:42 np0005592767 iptables.init[68064]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 22 16:56:42 np0005592767 iptables.init[68064]: iptables: Flushing firewall rules: [  OK  ]
Jan 22 16:56:42 np0005592767 systemd[1]: iptables.service: Deactivated successfully.
Jan 22 16:56:42 np0005592767 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 22 16:56:43 np0005592767 python3.9[68260]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:56:46 np0005592767 python3.9[68414]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:56:46 np0005592767 systemd[1]: Reloading.
Jan 22 16:56:46 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:56:46 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:56:46 np0005592767 systemd[1]: Starting Netfilter Tables...
Jan 22 16:56:46 np0005592767 systemd[1]: Finished Netfilter Tables.
Jan 22 16:56:48 np0005592767 python3.9[68605]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:56:49 np0005592767 python3.9[68758]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:49 np0005592767 python3.9[68883]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119008.717786-695-41639915983779/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:50 np0005592767 python3.9[69036]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:56:50 np0005592767 systemd[1]: Reloading OpenSSH server daemon...
Jan 22 16:56:50 np0005592767 systemd[1]: Reloaded OpenSSH server daemon.
Jan 22 16:56:51 np0005592767 python3.9[69192]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:52 np0005592767 python3.9[69344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:52 np0005592767 python3.9[69467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119011.9019892-788-46910865888034/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:54 np0005592767 python3.9[69619]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 22 16:56:54 np0005592767 systemd[1]: Starting Time & Date Service...
Jan 22 16:56:54 np0005592767 systemd[1]: Started Time & Date Service.
Jan 22 16:56:55 np0005592767 python3.9[69775]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:55 np0005592767 python3.9[69927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:56 np0005592767 python3.9[70050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119015.5245993-893-38048230906757/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:57 np0005592767 python3.9[70202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:57 np0005592767 python3.9[70325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119016.792377-938-47057352897212/.source.yaml _original_basename=.ke3nsmh8 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:56:58 np0005592767 python3.9[70477]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:56:59 np0005592767 python3.9[70600]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119018.2109773-983-214323251695427/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:00 np0005592767 python3.9[70752]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:01 np0005592767 python3.9[70905]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:02 np0005592767 python3[71058]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 16:57:03 np0005592767 python3.9[71210]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:57:04 np0005592767 python3.9[71333]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119022.795778-1100-228438131847439/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:05 np0005592767 python3.9[71485]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:57:06 np0005592767 python3.9[71608]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119024.746375-1145-142611708445347/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:07 np0005592767 python3.9[71760]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:57:07 np0005592767 python3.9[71883]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119026.5489638-1190-121365780280001/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:08 np0005592767 python3.9[72035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:57:09 np0005592767 python3.9[72158]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119028.1209831-1235-35087404834586/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:10 np0005592767 python3.9[72310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:57:11 np0005592767 python3.9[72433]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119029.9286838-1280-258362422911154/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:12 np0005592767 python3.9[72585]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:13 np0005592767 python3.9[72737]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:14 np0005592767 python3.9[72896]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:15 np0005592767 python3.9[73049]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:15 np0005592767 python3.9[73201]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:17 np0005592767 python3.9[73353]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 16:57:17 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 16:57:17 np0005592767 python3.9[73507]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 22 16:57:18 np0005592767 systemd-logind[802]: Session 15 logged out. Waiting for processes to exit.
Jan 22 16:57:18 np0005592767 systemd[1]: session-15.scope: Deactivated successfully.
Jan 22 16:57:18 np0005592767 systemd[1]: session-15.scope: Consumed 33.024s CPU time.
Jan 22 16:57:18 np0005592767 systemd-logind[802]: Removed session 15.
Jan 22 16:57:24 np0005592767 systemd-logind[802]: New session 16 of user zuul.
Jan 22 16:57:24 np0005592767 systemd[1]: Started Session 16 of User zuul.
Jan 22 16:57:24 np0005592767 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 16:57:25 np0005592767 python3.9[73690]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 22 16:57:26 np0005592767 python3.9[73842]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:57:27 np0005592767 python3.9[73994]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:57:28 np0005592767 python3.9[74146]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDsQ8HxD+acUzMW7y+ojQrFoQhnipuOmPD+pN/LYf0PEMFLjVKFPk/irRo0yydMpfEmQS+fgyBwWIpMFiyItLzghXwdJomlM4+mfL5fh7wMd1luI2YkbAn9gP2J+aUOws2xnHMA+KqDI+DlP0vFrjD5oW0pLyYX1GmHUf8iPZp1hTGPIW7hvNbWS08PbO72YvBEyGnlbvM1g5GAHYC34J/SeByxzhXxcCUBlapFnSLYSO5Iz9f9NKuEam0TX9/0fXYuBUNGGGXZCd1f3Y/de7PUYwWq9+Y4lCBYI1htjd2KgGaXTvvoAIPrbepVAGOGUgR74MjS/EPM/yfHTtAXmmqcX56DGdo5lDlCyEYaU6RwPKZ3KoZ6P+f6mh+HdgOtPNJSNPrqx+1MPZxv8YglYhGlsKP4wunP9+YobVw2L1OkQj15Ve+mq4oaTfVK11Whafo8emmZaaVWdSf/vTySrQKlGH4xuARGkl7OX5QTgl6VeucPYGRIMvNeHiXRus6No+8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKSmkRoT0KNQJUoAcRNTJPaE3FQRYD/wAwUPLYSAUsp8#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOQpt25ukwK1zMNKsuSjA5T22/WVXFcayjnSTcFTbBDbYSzIvs7g7A8Uz5saIem3Nj3Z5ICbv3FkmJqNu5uGxSk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMbqwZEYmgt4Vcgbv8u1Qt0lSNuSQB7H7j+Hl0xxZN4wkQtUIqdNOgGPMkHdeZXa0K65YhBP/+BcoETX6wD2q3UkJoRu5TCGI1SS+fFxru12xsnSzM7EpUqhYXjuj4iWDJwwXoKGvfWE4koZjdTzrpzaqTV35a6nyZ6W002lmFO3uIJpTbEX6LP+4LbKZDPRXHKgaPn+xeLkkQi/0/umRGuyzMWSc5vwOEqEgW2U7OHNlJKz7d/oB2v2A4lqecLCgqePXrngV2s4CGw0YBB6MOMtVJFTyH/DFKn3OV8pvcIUHJ1K+4ehR1J0ekr3OXcMxuqYTvLlF7oryPsV5d+AK7upNSRAGBb1TkrnQwzf3MacMNCLrerixTpO3AaKxgQwA3oDl3ZaVwRkDqga4B77+WIRtnEC9AHyxH4aIn4G8phcNinp0/Dzt2iv7fQ24qdMcBlEmnOMolXjn9P9PBl/dflKFLhViFeZcpm8v0cRoWtNM0oM9ulE/YmMMdQ6p9Qp0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOiXEreQbFNQXqTZYKwUY4lV5Q0Vn0xENHFGNNlfTBDN#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA88I7/5bkNrgFopiUfaNLL8soo268+yUoqVyiPopOEwtu6gxG5LSpaOBKcxiBaWS9dz2ydzWt+C5uaJi/r5z8s=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDYovLvWwVne9VsvTNCgmjav87jPUq/BRGKpdlwbBf5Ohf7q125s9EPd5z3jIwwmHFUYAtMZtk+wIC2FRYl56hAP7Wcc/xtJ+NZI8TsTuxchJYYfTsj0hgtMoIPz2KWqpFjbD/tGOhsqb14AxKbv3k+hH0wPGHbtB2RACT9owrJILTRRspSJsRRQsJI+KTSJ8rBRpSxkf8A7v/WOja7BcSQ8G8IuxC3RoVuocuw8/kJL/fhOaIpffzMmHR5bJSVxF1dGxgQOsBAddXZYPiQQcO7dzLlX8JJPwYyDZrCg7cGozd6AteSnPm0jphfRKNpKMEijcBHraRq2KIZu4ofBpO4jzGC1PmR60WPU7Zw59GX9Xip+6xLS06IUyIGHvj6oxvEa9NA6VhiZg7r+G1VUFo4auW4OPt+Fm2IVsaK4SwLRhlNyKOODJRENhFfYRMF+ERwyRTI030r8cuRHhYbeVOh279mpmrcU7r4Uo4V0OaBEnV9bot8fgZLWiELNo73538=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJp6xfeQzlhFSExtmr7lG0P1q/tf7XlRWYTeildkjaJT#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOltuaM3uIWGlrjiBwDU+eji0GOtYxIzYKGqfDQ/lMhMoMQkkyf0jLeN+5sZX7/5cWlTGwRhVmmyEOkGXf7OKgI=#012 create=True mode=0644 path=/tmp/ansible.jwobmes6 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:29 np0005592767 python3.9[74298]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.jwobmes6' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:30 np0005592767 python3.9[74452]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.jwobmes6 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:31 np0005592767 systemd[1]: session-16.scope: Deactivated successfully.
Jan 22 16:57:31 np0005592767 systemd[1]: session-16.scope: Consumed 3.175s CPU time.
Jan 22 16:57:31 np0005592767 systemd-logind[802]: Session 16 logged out. Waiting for processes to exit.
Jan 22 16:57:31 np0005592767 systemd-logind[802]: Removed session 16.
Jan 22 16:57:36 np0005592767 systemd-logind[802]: New session 17 of user zuul.
Jan 22 16:57:36 np0005592767 systemd[1]: Started Session 17 of User zuul.
Jan 22 16:57:37 np0005592767 python3.9[74630]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:57:39 np0005592767 python3.9[74786]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 16:57:40 np0005592767 python3.9[74940]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 16:57:41 np0005592767 python3.9[75093]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:42 np0005592767 python3.9[75246]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:57:43 np0005592767 python3.9[75400]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:44 np0005592767 python3.9[75555]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:57:44 np0005592767 systemd[1]: session-17.scope: Deactivated successfully.
Jan 22 16:57:44 np0005592767 systemd[1]: session-17.scope: Consumed 4.244s CPU time.
Jan 22 16:57:44 np0005592767 systemd-logind[802]: Session 17 logged out. Waiting for processes to exit.
Jan 22 16:57:44 np0005592767 systemd-logind[802]: Removed session 17.
Jan 22 16:57:50 np0005592767 systemd-logind[802]: New session 18 of user zuul.
Jan 22 16:57:50 np0005592767 systemd[1]: Started Session 18 of User zuul.
Jan 22 16:57:51 np0005592767 python3.9[75733]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:57:52 np0005592767 python3.9[75889]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:57:53 np0005592767 python3.9[75973]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 22 16:57:55 np0005592767 python3.9[76126]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:57:57 np0005592767 python3.9[76277]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 16:57:57 np0005592767 python3.9[76427]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:57:58 np0005592767 python3.9[76577]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:57:59 np0005592767 systemd[1]: session-18.scope: Deactivated successfully.
Jan 22 16:57:59 np0005592767 systemd[1]: session-18.scope: Consumed 5.521s CPU time.
Jan 22 16:57:59 np0005592767 systemd-logind[802]: Session 18 logged out. Waiting for processes to exit.
Jan 22 16:57:59 np0005592767 systemd-logind[802]: Removed session 18.
Jan 22 16:58:04 np0005592767 systemd-logind[802]: New session 19 of user zuul.
Jan 22 16:58:04 np0005592767 systemd[1]: Started Session 19 of User zuul.
Jan 22 16:58:05 np0005592767 python3.9[76755]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:58:07 np0005592767 python3.9[76911]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:07 np0005592767 python3.9[77063]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:08 np0005592767 python3.9[77215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:09 np0005592767 python3.9[77338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119088.0264096-153-207369654581289/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d01b51731c1cf9c122ebd3b997d276c7b281ea02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:10 np0005592767 python3.9[77490]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:10 np0005592767 python3.9[77613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119089.5460196-153-163534832281873/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=db65c959f56a6e9a98b3ef4e3f4f055ca1563e1d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:11 np0005592767 python3.9[77765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:11 np0005592767 python3.9[77888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119090.676209-153-82960784525905/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=1f91a8f7000af1790554d09433a90d9c5124ecc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:12 np0005592767 python3.9[78040]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:13 np0005592767 python3.9[78192]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:13 np0005592767 python3.9[78344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:14 np0005592767 python3.9[78467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119093.2995524-321-24771171347737/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=8c67281729090cf10ddec74ddfd0d51c9affeb2c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:14 np0005592767 python3.9[78619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:15 np0005592767 python3.9[78742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119094.4103959-321-46956803618237/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=8e99c559f7307e2be5911618292e726c1ea8db3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:16 np0005592767 python3.9[78894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:16 np0005592767 python3.9[79017]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119095.6194787-321-50911059041499/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=320ea0887162c395bd50a859d6bcc84934fb391c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:17 np0005592767 python3.9[79169]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:17 np0005592767 python3.9[79321]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:18 np0005592767 python3.9[79473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:19 np0005592767 python3.9[79596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119098.1871712-484-142209554158252/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=041b4c0bd2724fc5a53289d9d27166be9e4bb084 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:19 np0005592767 python3.9[79748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:20 np0005592767 python3.9[79871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119099.3935254-484-151512593489241/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=c08e8d5da31f6b1cbcb93a7e2f1ee2223d2c1be3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:20 np0005592767 python3.9[80023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:21 np0005592767 python3.9[80146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119100.520587-484-12551403522107/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=8910057c574c633eafb1732fddb1efb170c76d3b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:21 np0005592767 chronyd[64318]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 22 16:58:22 np0005592767 python3.9[80298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:22 np0005592767 python3.9[80450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:23 np0005592767 python3.9[80602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:23 np0005592767 python3.9[80725]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119102.9894004-640-150051774945188/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=3f683a6d6b48f605f0b3bfe16c9030963c5cdaf3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:24 np0005592767 python3.9[80877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:25 np0005592767 python3.9[81000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119104.0510526-640-54573365020925/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=c08e8d5da31f6b1cbcb93a7e2f1ee2223d2c1be3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:25 np0005592767 python3.9[81152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:26 np0005592767 python3.9[81275]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119105.2587397-640-190292880770565/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=90d3b2e011889df6c3b79eca06bcd3c73c57ba92 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:27 np0005592767 python3.9[81427]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:28 np0005592767 python3.9[81579]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:28 np0005592767 python3.9[81702]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119107.748806-805-16684148740564/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:29 np0005592767 python3.9[81854]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:30 np0005592767 python3.9[82006]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:30 np0005592767 python3.9[82129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119109.7854311-883-119413425756607/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:31 np0005592767 python3.9[82281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:32 np0005592767 python3.9[82433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:32 np0005592767 python3.9[82556]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119111.8395922-962-53814322884860/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:33 np0005592767 python3.9[82708]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:34 np0005592767 python3.9[82860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:34 np0005592767 python3.9[82983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119113.8176222-1028-67101693729857/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:35 np0005592767 python3.9[83135]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:36 np0005592767 python3.9[83287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:36 np0005592767 python3.9[83410]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119115.7020082-1093-79322566159718/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:37 np0005592767 python3.9[83562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:38 np0005592767 python3.9[83714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:38 np0005592767 python3.9[83837]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119117.5795324-1158-28556832782898/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:39 np0005592767 python3.9[83989]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:39 np0005592767 python3.9[84141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:58:40 np0005592767 python3.9[84264]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119119.5415225-1217-151766494506363/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8e0d87282f2dc47ca81a4c1306ca50e8ae5f6c80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:58:47 np0005592767 systemd[1]: session-19.scope: Deactivated successfully.
Jan 22 16:58:47 np0005592767 systemd[1]: session-19.scope: Consumed 27.792s CPU time.
Jan 22 16:58:47 np0005592767 systemd-logind[802]: Session 19 logged out. Waiting for processes to exit.
Jan 22 16:58:47 np0005592767 systemd-logind[802]: Removed session 19.
Jan 22 16:58:52 np0005592767 systemd-logind[802]: New session 20 of user zuul.
Jan 22 16:58:52 np0005592767 systemd[1]: Started Session 20 of User zuul.
Jan 22 16:58:53 np0005592767 python3.9[84442]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:58:55 np0005592767 python3.9[84598]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:55 np0005592767 python3.9[84750]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:58:56 np0005592767 python3.9[84900]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:58:57 np0005592767 python3.9[85052]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 16:58:59 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 22 16:59:00 np0005592767 python3.9[85208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 16:59:01 np0005592767 python3.9[85292]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 16:59:03 np0005592767 python3.9[85445]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 16:59:04 np0005592767 python3[85600]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 22 16:59:05 np0005592767 python3.9[85752]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:06 np0005592767 python3.9[85904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:06 np0005592767 python3.9[85982]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:07 np0005592767 python3.9[86134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:08 np0005592767 python3.9[86212]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zv62h4i6 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:09 np0005592767 python3.9[86364]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:09 np0005592767 python3.9[86442]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:10 np0005592767 python3.9[86594]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:11 np0005592767 python3[86747]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 16:59:12 np0005592767 python3.9[86899]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:13 np0005592767 python3.9[87024]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119151.894406-433-229278258180784/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:14 np0005592767 python3.9[87176]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:14 np0005592767 python3.9[87301]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119153.5028002-478-59363677909132/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:15 np0005592767 python3.9[87453]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:16 np0005592767 python3.9[87578]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119155.0062175-523-53828920567368/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:17 np0005592767 python3.9[87730]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:17 np0005592767 python3.9[87855]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119156.5126271-569-121615096372255/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:18 np0005592767 python3.9[88007]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:19 np0005592767 python3.9[88132]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119158.0147183-613-17848829553879/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:19 np0005592767 python3.9[88284]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:20 np0005592767 python3.9[88436]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:21 np0005592767 python3.9[88591]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:22 np0005592767 python3.9[88743]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:23 np0005592767 python3.9[88896]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:59:24 np0005592767 python3.9[89050]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:25 np0005592767 python3.9[89205]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:26 np0005592767 python3.9[89355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 16:59:27 np0005592767 python3.9[89508]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:27 np0005592767 ovs-vsctl[89509]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 22 16:59:28 np0005592767 python3.9[89661]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:29 np0005592767 python3.9[89816]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 16:59:29 np0005592767 ovs-vsctl[89817]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 22 16:59:30 np0005592767 python3.9[89967]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:59:31 np0005592767 python3.9[90121]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:59:31 np0005592767 python3.9[90273]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:32 np0005592767 python3.9[90351]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:59:32 np0005592767 python3.9[90503]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:33 np0005592767 python3.9[90581]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:59:34 np0005592767 python3.9[90733]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:35 np0005592767 python3.9[90885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:35 np0005592767 python3.9[90963]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:36 np0005592767 python3.9[91115]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:37 np0005592767 python3.9[91193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:38 np0005592767 python3.9[91345]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:59:38 np0005592767 systemd[1]: Reloading.
Jan 22 16:59:38 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:59:38 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:59:39 np0005592767 python3.9[91534]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:39 np0005592767 python3.9[91612]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:40 np0005592767 python3.9[91764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:41 np0005592767 python3.9[91842]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:42 np0005592767 python3.9[91994]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 16:59:42 np0005592767 systemd[1]: Reloading.
Jan 22 16:59:42 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:59:42 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 16:59:42 np0005592767 systemd[1]: Starting Create netns directory...
Jan 22 16:59:42 np0005592767 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 16:59:42 np0005592767 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 16:59:42 np0005592767 systemd[1]: Finished Create netns directory.
Jan 22 16:59:43 np0005592767 python3.9[92187]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:59:44 np0005592767 python3.9[92339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:44 np0005592767 python3.9[92462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119183.9703202-1366-255691323328621/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:59:46 np0005592767 python3.9[92614]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:47 np0005592767 python3.9[92766]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 16:59:47 np0005592767 python3.9[92918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 16:59:48 np0005592767 python3.9[93041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119187.4843004-1465-46356459840324/.source.json _original_basename=.c4t29ygi follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:49 np0005592767 python3.9[93191]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:52 np0005592767 python3.9[93614]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 22 16:59:54 np0005592767 python3.9[93766]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 16:59:55 np0005592767 python3[93918]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 16:59:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:59:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:59:55 np0005592767 podman[93954]: 2026-01-22 21:59:55.733388464 +0000 UTC m=+0.105098235 container create ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, managed_by=edpm_ansible)
Jan 22 16:59:55 np0005592767 podman[93954]: 2026-01-22 21:59:55.649269732 +0000 UTC m=+0.020979553 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 16:59:55 np0005592767 python3[93918]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=d88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 22 16:59:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 22 16:59:56 np0005592767 python3.9[94145]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:59:57 np0005592767 python3.9[94299]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:58 np0005592767 python3.9[94375]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 16:59:58 np0005592767 python3.9[94526]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119198.3364182-1699-5159720467577/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 16:59:59 np0005592767 python3.9[94602]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 16:59:59 np0005592767 systemd[1]: Reloading.
Jan 22 16:59:59 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 16:59:59 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:00:00 np0005592767 python3.9[94714]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:00:00 np0005592767 systemd[1]: Reloading.
Jan 22 17:00:00 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:00:00 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:00:00 np0005592767 systemd[1]: Starting ovn_controller container...
Jan 22 17:00:00 np0005592767 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 22 17:00:00 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:00:00 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55abaafc37880e6f85779303431eee429d0a4a491aa523e731fc195128d07db8/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 22 17:00:00 np0005592767 systemd[1]: Started /usr/bin/podman healthcheck run ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319.
Jan 22 17:00:00 np0005592767 podman[94754]: 2026-01-22 22:00:00.704063917 +0000 UTC m=+0.121850617 container init ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 17:00:00 np0005592767 ovn_controller[94769]: + sudo -E kolla_set_configs
Jan 22 17:00:00 np0005592767 podman[94754]: 2026-01-22 22:00:00.729910106 +0000 UTC m=+0.147696796 container start ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:00:00 np0005592767 edpm-start-podman-container[94754]: ovn_controller
Jan 22 17:00:00 np0005592767 systemd[1]: Created slice User Slice of UID 0.
Jan 22 17:00:00 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 22 17:00:00 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 22 17:00:00 np0005592767 systemd[1]: Starting User Manager for UID 0...
Jan 22 17:00:00 np0005592767 edpm-start-podman-container[94753]: Creating additional drop-in dependency for "ovn_controller" (ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319)
Jan 22 17:00:00 np0005592767 systemd[1]: Reloading.
Jan 22 17:00:00 np0005592767 podman[94776]: 2026-01-22 22:00:00.831450669 +0000 UTC m=+0.090892764 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:00:00 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:00:00 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:00:00 np0005592767 systemd[94807]: Queued start job for default target Main User Target.
Jan 22 17:00:00 np0005592767 systemd[94807]: Created slice User Application Slice.
Jan 22 17:00:00 np0005592767 systemd[94807]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 22 17:00:00 np0005592767 systemd[94807]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:00:00 np0005592767 systemd[94807]: Reached target Paths.
Jan 22 17:00:00 np0005592767 systemd[94807]: Reached target Timers.
Jan 22 17:00:00 np0005592767 systemd[94807]: Starting D-Bus User Message Bus Socket...
Jan 22 17:00:00 np0005592767 systemd[94807]: Starting Create User's Volatile Files and Directories...
Jan 22 17:00:00 np0005592767 systemd[94807]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:00:00 np0005592767 systemd[94807]: Finished Create User's Volatile Files and Directories.
Jan 22 17:00:00 np0005592767 systemd[94807]: Reached target Sockets.
Jan 22 17:00:00 np0005592767 systemd[94807]: Reached target Basic System.
Jan 22 17:00:00 np0005592767 systemd[94807]: Reached target Main User Target.
Jan 22 17:00:00 np0005592767 systemd[94807]: Startup finished in 130ms.
Jan 22 17:00:01 np0005592767 systemd[1]: Started User Manager for UID 0.
Jan 22 17:00:01 np0005592767 systemd[1]: ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319-49860aee198b0930.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 17:00:01 np0005592767 systemd[1]: ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319-49860aee198b0930.service: Failed with result 'exit-code'.
Jan 22 17:00:01 np0005592767 systemd[1]: Started ovn_controller container.
Jan 22 17:00:01 np0005592767 systemd[1]: Started Session c1 of User root.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: INFO:__main__:Validating config file
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: INFO:__main__:Writing out command to execute
Jan 22 17:00:01 np0005592767 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: ++ cat /run_command
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + ARGS=
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + sudo kolla_copy_cacerts
Jan 22 17:00:01 np0005592767 systemd[1]: Started Session c2 of User root.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + [[ ! -n '' ]]
Jan 22 17:00:01 np0005592767 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + . kolla_extend_start
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + umask 0022
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2083] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2093] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <warn>  [1769119201.2095] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2101] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2105] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2108] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 22 17:00:01 np0005592767 kernel: br-int: entered promiscuous mode
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 17:00:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:01Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2352] manager: (ovn-bdc194-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 22 17:00:01 np0005592767 systemd-udevd[94901]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:00:01 np0005592767 systemd-udevd[94902]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:00:01 np0005592767 kernel: genev_sys_6081: entered promiscuous mode
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2555] device (genev_sys_6081): carrier: link connected
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.2559] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.6472] manager: (ovn-b36a49-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 22 17:00:01 np0005592767 NetworkManager[54973]: <info>  [1769119201.7903] manager: (ovn-67ef43-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 22 17:00:02 np0005592767 python3.9[95032]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 17:00:03 np0005592767 python3.9[95184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:04 np0005592767 python3.9[95307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119203.2576923-1834-95144665484108/.source.yaml _original_basename=.35i2xwd5 follow=False checksum=4c948b03318d4125f88ccbb3951023e44c0d629c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:05 np0005592767 python3.9[95459]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:00:05 np0005592767 ovs-vsctl[95460]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 22 17:00:05 np0005592767 python3.9[95612]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:00:05 np0005592767 ovs-vsctl[95614]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 22 17:00:07 np0005592767 python3.9[95767]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:00:07 np0005592767 ovs-vsctl[95768]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 22 17:00:07 np0005592767 systemd[1]: session-20.scope: Deactivated successfully.
Jan 22 17:00:07 np0005592767 systemd[1]: session-20.scope: Consumed 45.612s CPU time.
Jan 22 17:00:07 np0005592767 systemd-logind[802]: Session 20 logged out. Waiting for processes to exit.
Jan 22 17:00:07 np0005592767 systemd-logind[802]: Removed session 20.
Jan 22 17:00:11 np0005592767 systemd[1]: Stopping User Manager for UID 0...
Jan 22 17:00:11 np0005592767 systemd[94807]: Activating special unit Exit the Session...
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped target Main User Target.
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped target Basic System.
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped target Paths.
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped target Sockets.
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped target Timers.
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:00:11 np0005592767 systemd[94807]: Closed D-Bus User Message Bus Socket.
Jan 22 17:00:11 np0005592767 systemd[94807]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:00:11 np0005592767 systemd[94807]: Removed slice User Application Slice.
Jan 22 17:00:11 np0005592767 systemd[94807]: Reached target Shutdown.
Jan 22 17:00:11 np0005592767 systemd[94807]: Finished Exit the Session.
Jan 22 17:00:11 np0005592767 systemd[94807]: Reached target Exit the Session.
Jan 22 17:00:11 np0005592767 systemd[1]: user@0.service: Deactivated successfully.
Jan 22 17:00:11 np0005592767 systemd[1]: Stopped User Manager for UID 0.
Jan 22 17:00:11 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 22 17:00:11 np0005592767 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 22 17:00:11 np0005592767 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 22 17:00:11 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 22 17:00:11 np0005592767 systemd[1]: Removed slice User Slice of UID 0.
Jan 22 17:00:12 np0005592767 systemd-logind[802]: New session 22 of user zuul.
Jan 22 17:00:12 np0005592767 systemd[1]: Started Session 22 of User zuul.
Jan 22 17:00:13 np0005592767 python3.9[95951]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:00:15 np0005592767 python3.9[96107]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:15 np0005592767 python3.9[96259]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:16 np0005592767 python3.9[96411]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:16 np0005592767 python3.9[96563]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:17 np0005592767 python3.9[96715]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:19 np0005592767 python3.9[96865]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:00:19 np0005592767 python3.9[97018]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 22 17:00:21 np0005592767 python3.9[97168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:22 np0005592767 python3.9[97289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119220.9030602-221-35253534227667/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:22 np0005592767 python3.9[97439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:23 np0005592767 python3.9[97560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119222.367046-265-192708685930492/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:24 np0005592767 python3.9[97712]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 17:00:25 np0005592767 python3.9[97796]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 17:00:27 np0005592767 python3.9[97949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 17:00:28 np0005592767 python3.9[98102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:29 np0005592767 python3.9[98223]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119228.2339675-377-180472003444941/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:29 np0005592767 python3.9[98373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:30 np0005592767 python3.9[98494]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119229.416054-377-35440088512474/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:31Z|00025|memory|INFO|16384 kB peak resident set size after 30.0 seconds
Jan 22 17:00:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:00:31Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 22 17:00:31 np0005592767 podman[98519]: 2026-01-22 22:00:31.214400914 +0000 UTC m=+0.121774946 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 17:00:31 np0005592767 python3.9[98671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:32 np0005592767 python3.9[98792]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119231.4656985-508-48748199733915/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:33 np0005592767 python3.9[98942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:33 np0005592767 python3.9[99063]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119232.6107535-508-118587081523585/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:34 np0005592767 python3.9[99213]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:00:35 np0005592767 python3.9[99367]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:36 np0005592767 python3.9[99519]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:36 np0005592767 python3.9[99597]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:37 np0005592767 python3.9[99749]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:37 np0005592767 python3.9[99827]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:38 np0005592767 python3.9[99979]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:39 np0005592767 python3.9[100131]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:40 np0005592767 python3.9[100209]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:40 np0005592767 python3.9[100361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:41 np0005592767 python3.9[100439]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:42 np0005592767 python3.9[100591]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:00:42 np0005592767 systemd[1]: Reloading.
Jan 22 17:00:42 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:00:42 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:00:43 np0005592767 python3.9[100780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:43 np0005592767 python3.9[100858]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:44 np0005592767 python3.9[101010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:45 np0005592767 python3.9[101088]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:45 np0005592767 python3.9[101240]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:00:46 np0005592767 systemd[1]: Reloading.
Jan 22 17:00:46 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:00:46 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:00:46 np0005592767 systemd[1]: Starting Create netns directory...
Jan 22 17:00:46 np0005592767 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 22 17:00:46 np0005592767 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 22 17:00:46 np0005592767 systemd[1]: Finished Create netns directory.
Jan 22 17:00:47 np0005592767 python3.9[101435]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:48 np0005592767 python3.9[101587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:48 np0005592767 python3.9[101710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119247.5652392-962-100722288863211/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:49 np0005592767 python3.9[101862]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:50 np0005592767 python3.9[102014]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:00:51 np0005592767 python3.9[102166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:00:51 np0005592767 python3.9[102289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119250.6058087-1060-27935262275253/.source.json _original_basename=.h71f45xh follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:52 np0005592767 python3.9[102439]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:00:54 np0005592767 python3.9[102862]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 22 17:00:56 np0005592767 python3.9[103014]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 17:00:57 np0005592767 python3[103166]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 17:01:02 np0005592767 podman[103237]: 2026-01-22 22:01:02.704153762 +0000 UTC m=+0.610211814 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Jan 22 17:01:04 np0005592767 podman[103178]: 2026-01-22 22:01:04.223350837 +0000 UTC m=+6.870113954 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:01:04 np0005592767 podman[103317]: 2026-01-22 22:01:04.39845592 +0000 UTC m=+0.055933701 container create 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Jan 22 17:01:04 np0005592767 podman[103317]: 2026-01-22 22:01:04.369973311 +0000 UTC m=+0.027451122 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:01:04 np0005592767 python3[103166]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=d88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:01:05 np0005592767 python3.9[103506]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:01:07 np0005592767 python3.9[103660]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:07 np0005592767 python3.9[103736]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:01:08 np0005592767 python3.9[103887]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119267.6573963-1294-163016553184322/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:08 np0005592767 python3.9[103963]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:01:08 np0005592767 systemd[1]: Reloading.
Jan 22 17:01:08 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:01:08 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:01:09 np0005592767 python3.9[104073]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:09 np0005592767 systemd[1]: Reloading.
Jan 22 17:01:09 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:01:09 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:01:09 np0005592767 systemd[1]: Starting ovn_metadata_agent container...
Jan 22 17:01:10 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:01:10 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98b69c507be6983d158fbc861ce30e6c1112780d380c32ad7a30cb7234102637/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 22 17:01:10 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98b69c507be6983d158fbc861ce30e6c1112780d380c32ad7a30cb7234102637/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:01:10 np0005592767 systemd[1]: Started /usr/bin/podman healthcheck run 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c.
Jan 22 17:01:10 np0005592767 podman[104114]: 2026-01-22 22:01:10.094827052 +0000 UTC m=+0.140681190 container init 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + sudo -E kolla_set_configs
Jan 22 17:01:10 np0005592767 podman[104114]: 2026-01-22 22:01:10.125609315 +0000 UTC m=+0.171463423 container start 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:01:10 np0005592767 edpm-start-podman-container[104114]: ovn_metadata_agent
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Validating config file
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 17:01:10 np0005592767 edpm-start-podman-container[104113]: Creating additional drop-in dependency for "ovn_metadata_agent" (67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c)
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Copying service configuration files
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Writing out command to execute
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: ++ cat /run_command
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + CMD=neutron-ovn-metadata-agent
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + ARGS=
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + sudo kolla_copy_cacerts
Jan 22 17:01:10 np0005592767 systemd[1]: Reloading.
Jan 22 17:01:10 np0005592767 podman[104137]: 2026-01-22 22:01:10.232245421 +0000 UTC m=+0.092724181 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + [[ ! -n '' ]]
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + . kolla_extend_start
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: Running command: 'neutron-ovn-metadata-agent'
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + umask 0022
Jan 22 17:01:10 np0005592767 ovn_metadata_agent[104130]: + exec neutron-ovn-metadata-agent
Jan 22 17:01:10 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:01:10 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:01:10 np0005592767 systemd[1]: Started ovn_metadata_agent container.
Jan 22 17:01:11 np0005592767 python3.9[104369]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.032 104135 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.032 104135 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.032 104135 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.032 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.033 104135 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.034 104135 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.035 104135 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.036 104135 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.037 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.038 104135 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.039 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.040 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.041 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.042 104135 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.043 104135 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.044 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.045 104135 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.046 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.047 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.048 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.049 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.050 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.051 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.052 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.053 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.054 104135 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.055 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.056 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.057 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.058 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.059 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.060 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.061 104135 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.062 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.063 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.064 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.065 104135 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.066 104135 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.127 104135 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.127 104135 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.127 104135 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.128 104135 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.128 104135 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.140 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e130c2ec-fef7-4ed2-892d-1e3d7eaab401 (UUID: e130c2ec-fef7-4ed2-892d-1e3d7eaab401) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.165 104135 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.165 104135 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.165 104135 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.165 104135 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.169 104135 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.177 104135 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.182 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e130c2ec-fef7-4ed2-892d-1e3d7eaab401'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], external_ids={}, name=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, nb_cfg_timestamp=1769119209228, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.183 104135 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f1308164b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.184 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.184 104135 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.184 104135 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.184 104135 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.188 104135 DEBUG oslo_service.service [-] Started child 104394 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.191 104135 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8dtl0f58/privsep.sock']#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.191 104394 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-176272'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.220 104394 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.221 104394 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.221 104394 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.225 104394 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.233 104394 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 22 17:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.240 104394 INFO eventlet.wsgi.server [-] (104394) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 22 17:01:12 np0005592767 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 22 17:01:13 np0005592767 python3.9[104527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:13.012 104135 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:13.013 104135 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp8dtl0f58/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.803 104518 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.808 104518 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.810 104518 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:12.811 104518 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104518#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:13.017 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[974da216-defd-40e0-b69c-c7e556c3ab56]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:13.618 104518 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:13.618 104518 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:01:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:13.618 104518 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:01:13 np0005592767 python3.9[104656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119272.5600338-1429-136767719629626/.source.yaml _original_basename=.jk0txy5e follow=False checksum=e950e86bcd2f4ea172cbb26957801b4267cffda4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.227 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d36ac510-849d-40cb-8b52-c9d3b94f3bc8]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.230 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, column=external_ids, values=({'neutron:ovn-metadata-id': '8aec5dcc-a6a3-5e9e-8c53-ecf90e1ba80d'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.348 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.356 104135 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.356 104135 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.356 104135 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.356 104135 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.356 104135 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.357 104135 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.357 104135 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.357 104135 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.357 104135 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.357 104135 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.358 104135 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.358 104135 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.358 104135 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.358 104135 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.358 104135 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.359 104135 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.359 104135 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.359 104135 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.359 104135 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.359 104135 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.359 104135 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.360 104135 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.360 104135 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.360 104135 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.360 104135 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.360 104135 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.361 104135 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.361 104135 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.361 104135 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.361 104135 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.361 104135 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.362 104135 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.362 104135 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.362 104135 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.362 104135 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.362 104135 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.363 104135 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.363 104135 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.363 104135 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.363 104135 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.363 104135 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.364 104135 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.365 104135 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.365 104135 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.365 104135 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.365 104135 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.365 104135 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.365 104135 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.366 104135 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.366 104135 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.366 104135 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.366 104135 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.366 104135 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.366 104135 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.367 104135 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.367 104135 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.367 104135 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.367 104135 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.367 104135 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.367 104135 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.368 104135 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.368 104135 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.368 104135 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.368 104135 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.368 104135 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.368 104135 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.369 104135 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.369 104135 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.369 104135 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.369 104135 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.369 104135 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.369 104135 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.370 104135 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.370 104135 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.370 104135 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.370 104135 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.370 104135 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.370 104135 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.371 104135 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.371 104135 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.371 104135 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.371 104135 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.371 104135 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.371 104135 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.372 104135 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.372 104135 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.372 104135 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.372 104135 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.372 104135 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.372 104135 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.373 104135 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.374 104135 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.374 104135 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.374 104135 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.374 104135 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.374 104135 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.374 104135 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.375 104135 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.375 104135 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.375 104135 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.375 104135 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.375 104135 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.376 104135 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.376 104135 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.376 104135 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.376 104135 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.376 104135 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.376 104135 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.377 104135 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.377 104135 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.377 104135 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.377 104135 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.377 104135 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.378 104135 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.378 104135 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.378 104135 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.378 104135 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.378 104135 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.378 104135 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.379 104135 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.379 104135 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.379 104135 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.379 104135 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.379 104135 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.380 104135 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.380 104135 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.380 104135 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.380 104135 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.380 104135 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.380 104135 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.381 104135 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.382 104135 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.383 104135 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.384 104135 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.385 104135 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.386 104135 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.387 104135 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.388 104135 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.389 104135 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.390 104135 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.391 104135 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.392 104135 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.393 104135 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.394 104135 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.395 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.396 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.397 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.398 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.399 104135 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.400 104135 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.400 104135 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:01:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:01:14.400 104135 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 17:01:14 np0005592767 systemd[1]: session-22.scope: Deactivated successfully.
Jan 22 17:01:14 np0005592767 systemd[1]: session-22.scope: Consumed 51.395s CPU time.
Jan 22 17:01:14 np0005592767 systemd-logind[802]: Session 22 logged out. Waiting for processes to exit.
Jan 22 17:01:14 np0005592767 systemd-logind[802]: Removed session 22.
Jan 22 17:01:21 np0005592767 systemd-logind[802]: New session 23 of user zuul.
Jan 22 17:01:21 np0005592767 systemd[1]: Started Session 23 of User zuul.
Jan 22 17:01:22 np0005592767 python3.9[104834]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:01:27 np0005592767 python3.9[104990]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:01:29 np0005592767 python3.9[105164]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:01:29 np0005592767 systemd[1]: Reloading.
Jan 22 17:01:29 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:01:29 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:01:31 np0005592767 python3.9[105348]: ansible-ansible.builtin.service_facts Invoked
Jan 22 17:01:31 np0005592767 network[105365]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 17:01:31 np0005592767 network[105366]: 'network-scripts' will be removed from distribution in near future.
Jan 22 17:01:31 np0005592767 network[105367]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 17:01:33 np0005592767 podman[105465]: 2026-01-22 22:01:33.980959653 +0000 UTC m=+0.084857673 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller)
Jan 22 17:01:38 np0005592767 python3.9[105651]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:39 np0005592767 python3.9[105804]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:40 np0005592767 python3.9[105957]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:40 np0005592767 podman[106082]: 2026-01-22 22:01:40.540087466 +0000 UTC m=+0.053014979 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:01:40 np0005592767 python3.9[106128]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:41 np0005592767 python3.9[106281]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:42 np0005592767 python3.9[106434]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:43 np0005592767 python3.9[106587]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:01:45 np0005592767 python3.9[106740]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:46 np0005592767 python3.9[106892]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:46 np0005592767 python3.9[107044]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:47 np0005592767 python3.9[107196]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:47 np0005592767 python3.9[107348]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:48 np0005592767 python3.9[107500]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:49 np0005592767 python3.9[107652]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:50 np0005592767 python3.9[107804]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:51 np0005592767 python3.9[107956]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:52 np0005592767 python3.9[108108]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:52 np0005592767 python3.9[108260]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:53 np0005592767 python3.9[108412]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:54 np0005592767 python3.9[108564]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:54 np0005592767 python3.9[108716]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:01:56 np0005592767 python3.9[108868]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:01:57 np0005592767 python3.9[109021]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 17:01:58 np0005592767 python3.9[109173]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:01:58 np0005592767 systemd[1]: Reloading.
Jan 22 17:01:58 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:01:58 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:01:59 np0005592767 python3.9[109360]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:00 np0005592767 python3.9[109513]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:00 np0005592767 python3.9[109666]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:01 np0005592767 python3.9[109819]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:01 np0005592767 python3.9[109972]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:02 np0005592767 python3.9[110125]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:03 np0005592767 python3.9[110278]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:02:04 np0005592767 podman[110304]: 2026-01-22 22:02:04.191905088 +0000 UTC m=+0.103508967 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:02:04 np0005592767 python3.9[110458]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 22 17:02:05 np0005592767 python3.9[110611]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 17:02:06 np0005592767 python3.9[110769]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 17:02:08 np0005592767 python3.9[110929]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 17:02:08 np0005592767 python3.9[111013]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 17:02:11 np0005592767 podman[111021]: 2026-01-22 22:02:11.165057007 +0000 UTC m=+0.078597152 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:02:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:02:12.068 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:02:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:02:12.070 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:02:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:02:12.070 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:02:35 np0005592767 podman[111225]: 2026-01-22 22:02:35.237497777 +0000 UTC m=+0.133689709 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:02:37 np0005592767 kernel: SELinux:  Converting 2763 SID table entries...
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 17:02:37 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 17:02:42 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 22 17:02:42 np0005592767 podman[111261]: 2026-01-22 22:02:42.17332103 +0000 UTC m=+0.068523218 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:02:48 np0005592767 kernel: SELinux:  Converting 2763 SID table entries...
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 17:02:48 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 17:03:06 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 22 17:03:06 np0005592767 podman[114225]: 2026-01-22 22:03:06.179654411 +0000 UTC m=+0.087025421 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:03:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:03:12.070 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:03:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:03:12.071 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:03:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:03:12.072 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:03:13 np0005592767 podman[118956]: 2026-01-22 22:03:13.142049215 +0000 UTC m=+0.056367475 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:03:37 np0005592767 podman[128207]: 2026-01-22 22:03:37.180667709 +0000 UTC m=+0.099032790 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 17:03:39 np0005592767 kernel: SELinux:  Converting 2764 SID table entries...
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability network_peer_controls=1
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability open_perms=1
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability extended_socket_class=1
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability always_check_network=0
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 22 17:03:39 np0005592767 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 22 17:03:43 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 17:03:43 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 22 17:03:43 np0005592767 dbus-broker-launch[778]: Noticed file-system modification, trigger reload.
Jan 22 17:03:44 np0005592767 podman[128265]: 2026-01-22 22:03:44.081148663 +0000 UTC m=+0.061601992 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:03:51 np0005592767 systemd[1]: Stopping OpenSSH server daemon...
Jan 22 17:03:51 np0005592767 systemd[1]: sshd.service: Deactivated successfully.
Jan 22 17:03:51 np0005592767 systemd[1]: Stopped OpenSSH server daemon.
Jan 22 17:03:51 np0005592767 systemd[1]: sshd.service: Consumed 2.074s CPU time, read 32.0K from disk, written 44.0K to disk.
Jan 22 17:03:51 np0005592767 systemd[1]: Stopped target sshd-keygen.target.
Jan 22 17:03:51 np0005592767 systemd[1]: Stopping sshd-keygen.target...
Jan 22 17:03:51 np0005592767 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 17:03:51 np0005592767 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 17:03:51 np0005592767 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 22 17:03:51 np0005592767 systemd[1]: Reached target sshd-keygen.target.
Jan 22 17:03:51 np0005592767 systemd[1]: Starting OpenSSH server daemon...
Jan 22 17:03:51 np0005592767 systemd[1]: Started OpenSSH server daemon.
Jan 22 17:03:52 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 17:03:52 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 17:03:52 np0005592767 systemd[1]: Reloading.
Jan 22 17:03:52 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:03:52 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:03:53 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 17:04:00 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 17:04:00 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 17:04:00 np0005592767 systemd[1]: man-db-cache-update.service: Consumed 9.795s CPU time.
Jan 22 17:04:00 np0005592767 systemd[1]: run-rdbec5a2e4ec34cecb75d9bd428c6ecd4.service: Deactivated successfully.
Jan 22 17:04:08 np0005592767 podman[137676]: 2026-01-22 22:04:08.170076973 +0000 UTC m=+0.090857779 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:04:10 np0005592767 python3.9[137828]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 17:04:10 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:10 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:10 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:11 np0005592767 python3.9[138017]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 17:04:11 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:12 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:12 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:04:12.071 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:04:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:04:12.073 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:04:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:04:12.074 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:04:13 np0005592767 python3.9[138208]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 17:04:13 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:13 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:13 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:14 np0005592767 podman[138370]: 2026-01-22 22:04:14.403848587 +0000 UTC m=+0.064997084 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 17:04:14 np0005592767 python3.9[138418]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 17:04:14 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:14 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:14 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:15 np0005592767 python3.9[138610]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:15 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:15 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:15 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:17 np0005592767 python3.9[138800]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:17 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:17 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:17 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:18 np0005592767 python3.9[138991]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:18 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:18 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:18 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:19 np0005592767 python3.9[139181]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:20 np0005592767 python3.9[139336]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:20 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:20 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:20 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:22 np0005592767 python3.9[139526]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 22 17:04:22 np0005592767 systemd[1]: Reloading.
Jan 22 17:04:22 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:04:22 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:04:22 np0005592767 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 22 17:04:22 np0005592767 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 22 17:04:23 np0005592767 python3.9[139720]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:24 np0005592767 python3.9[139875]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:26 np0005592767 python3.9[140030]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:27 np0005592767 python3.9[140185]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:28 np0005592767 python3.9[140340]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:28 np0005592767 python3.9[140495]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:29 np0005592767 python3.9[140650]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:30 np0005592767 python3.9[140805]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:31 np0005592767 python3.9[140960]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:32 np0005592767 python3.9[141115]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:32 np0005592767 python3.9[141270]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:33 np0005592767 python3.9[141425]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:34 np0005592767 python3.9[141580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:35 np0005592767 python3.9[141735]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 22 17:04:39 np0005592767 podman[141765]: 2026-01-22 22:04:39.016216875 +0000 UTC m=+0.144531843 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 22 17:04:39 np0005592767 python3.9[141920]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:04:40 np0005592767 python3.9[142072]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:04:40 np0005592767 python3.9[142224]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:04:41 np0005592767 python3.9[142376]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:04:42 np0005592767 python3.9[142528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:04:42 np0005592767 python3.9[142680]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:04:43 np0005592767 python3.9[142830]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:04:44 np0005592767 podman[142954]: 2026-01-22 22:04:44.626178426 +0000 UTC m=+0.059223592 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 17:04:44 np0005592767 python3.9[142998]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:45 np0005592767 python3.9[143126]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119484.2195208-1649-245467635297678/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:46 np0005592767 python3.9[143278]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:47 np0005592767 python3.9[143403]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119485.6731608-1649-239199791149519/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:48 np0005592767 python3.9[143555]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:48 np0005592767 python3.9[143680]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119487.59268-1649-57475647661886/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:49 np0005592767 python3.9[143832]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:49 np0005592767 python3.9[143957]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119488.8062444-1649-6606352730006/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:50 np0005592767 python3.9[144109]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:50 np0005592767 python3.9[144234]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119489.9635603-1649-154655354452467/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:51 np0005592767 python3.9[144386]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:52 np0005592767 python3.9[144511]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119491.1445973-1649-232358121586774/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:52 np0005592767 python3.9[144663]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:53 np0005592767 python3.9[144786]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119492.3173823-1649-33043104788101/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:53 np0005592767 python3.9[144938]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:04:54 np0005592767 python3.9[145063]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769119493.5117724-1649-278109090351111/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:56 np0005592767 python3.9[145215]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 22 17:04:57 np0005592767 python3.9[145368]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:58 np0005592767 python3.9[145520]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:04:59 np0005592767 python3.9[145672]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:00 np0005592767 python3.9[145824]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:00 np0005592767 python3.9[145976]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:01 np0005592767 python3.9[146128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:02 np0005592767 python3.9[146280]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:03 np0005592767 python3.9[146432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:03 np0005592767 python3.9[146584]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:04 np0005592767 python3.9[146736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:05 np0005592767 python3.9[146888]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:05 np0005592767 python3.9[147040]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:06 np0005592767 python3.9[147192]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:06 np0005592767 python3.9[147344]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:08 np0005592767 python3.9[147496]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:09 np0005592767 podman[147505]: 2026-01-22 22:05:09.249847259 +0000 UTC m=+0.147915549 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:05:09 np0005592767 python3.9[147643]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119508.5103247-2312-22459824623990/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:10 np0005592767 python3.9[147795]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:10 np0005592767 python3.9[147918]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119509.9341125-2312-106981842667030/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:11 np0005592767 python3.9[148070]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:05:12.073 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:05:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:05:12.074 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:05:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:05:12.074 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:05:12 np0005592767 python3.9[148193]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119511.064795-2312-132917492492806/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:12 np0005592767 python3.9[148345]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:13 np0005592767 python3.9[148468]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119512.2868054-2312-69738186233845/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:13 np0005592767 python3.9[148620]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:14 np0005592767 python3.9[148743]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119513.5038042-2312-61928690039328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:15 np0005592767 podman[148867]: 2026-01-22 22:05:15.028106877 +0000 UTC m=+0.061891877 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:05:15 np0005592767 python3.9[148912]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:15 np0005592767 python3.9[149036]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119514.7103016-2312-135016295157402/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:16 np0005592767 python3.9[149188]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:16 np0005592767 python3.9[149311]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119515.9000685-2312-163952094552586/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:17 np0005592767 python3.9[149463]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:18 np0005592767 python3.9[149586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119517.0211065-2312-171351401464785/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:18 np0005592767 python3.9[149738]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:19 np0005592767 python3.9[149861]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119518.1734543-2312-24840517998633/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:19 np0005592767 python3.9[150013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:20 np0005592767 python3.9[150136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119519.3794756-2312-239506844232127/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:21 np0005592767 python3.9[150288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:21 np0005592767 python3.9[150411]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119520.614249-2312-71784250151052/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:22 np0005592767 python3.9[150563]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:22 np0005592767 python3.9[150686]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119521.9136825-2312-81654678868926/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:23 np0005592767 python3.9[150838]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:24 np0005592767 python3.9[150961]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119523.1315227-2312-164923837394262/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:24 np0005592767 python3.9[151113]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:25 np0005592767 python3.9[151236]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119524.3453302-2312-77083012172305/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:28 np0005592767 python3.9[151386]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:05:29 np0005592767 python3.9[151541]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 22 17:05:31 np0005592767 dbus-broker-launch[779]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 22 17:05:31 np0005592767 python3.9[151697]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:32 np0005592767 python3.9[151849]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:32 np0005592767 python3.9[152001]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:33 np0005592767 python3.9[152153]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:33 np0005592767 python3.9[152305]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:35 np0005592767 python3.9[152457]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:36 np0005592767 python3.9[152609]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:36 np0005592767 python3.9[152761]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:37 np0005592767 python3.9[152913]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:37 np0005592767 python3.9[153065]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:39 np0005592767 python3.9[153217]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:05:39 np0005592767 systemd[1]: Reloading.
Jan 22 17:05:39 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:05:39 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:05:39 np0005592767 podman[153219]: 2026-01-22 22:05:39.682701609 +0000 UTC m=+0.171901779 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:05:39 np0005592767 systemd[1]: Starting libvirt logging daemon socket...
Jan 22 17:05:39 np0005592767 systemd[1]: Listening on libvirt logging daemon socket.
Jan 22 17:05:39 np0005592767 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 22 17:05:39 np0005592767 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 22 17:05:39 np0005592767 systemd[1]: Starting libvirt logging daemon...
Jan 22 17:05:39 np0005592767 systemd[1]: Started libvirt logging daemon.
Jan 22 17:05:40 np0005592767 python3.9[153435]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:05:40 np0005592767 systemd[1]: Reloading.
Jan 22 17:05:40 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:05:40 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:05:40 np0005592767 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 22 17:05:40 np0005592767 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 22 17:05:40 np0005592767 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 22 17:05:40 np0005592767 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 22 17:05:40 np0005592767 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 22 17:05:40 np0005592767 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 22 17:05:41 np0005592767 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 17:05:41 np0005592767 systemd[1]: Started libvirt nodedev daemon.
Jan 22 17:05:41 np0005592767 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 22 17:05:41 np0005592767 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 22 17:05:41 np0005592767 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 22 17:05:41 np0005592767 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 22 17:05:41 np0005592767 python3.9[153652]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:05:41 np0005592767 systemd[1]: Reloading.
Jan 22 17:05:41 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:05:41 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:05:41 np0005592767 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 22 17:05:41 np0005592767 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 22 17:05:41 np0005592767 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 22 17:05:41 np0005592767 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 22 17:05:41 np0005592767 systemd[1]: Starting libvirt proxy daemon...
Jan 22 17:05:42 np0005592767 systemd[1]: Started libvirt proxy daemon.
Jan 22 17:05:42 np0005592767 setroubleshoot[153498]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a6df4a81-bf0d-4178-a358-5c926c05b791
Jan 22 17:05:42 np0005592767 setroubleshoot[153498]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 22 17:05:42 np0005592767 setroubleshoot[153498]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a6df4a81-bf0d-4178-a358-5c926c05b791
Jan 22 17:05:42 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:05:42 np0005592767 setroubleshoot[153498]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 22 17:05:42 np0005592767 python3.9[153872]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:05:42 np0005592767 systemd[1]: Reloading.
Jan 22 17:05:42 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:05:42 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:05:43 np0005592767 systemd[1]: Listening on libvirt locking daemon socket.
Jan 22 17:05:43 np0005592767 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 22 17:05:43 np0005592767 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 22 17:05:43 np0005592767 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 22 17:05:43 np0005592767 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 22 17:05:43 np0005592767 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 22 17:05:43 np0005592767 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 22 17:05:43 np0005592767 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 22 17:05:43 np0005592767 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 22 17:05:43 np0005592767 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 22 17:05:43 np0005592767 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 17:05:43 np0005592767 systemd[1]: Started libvirt QEMU daemon.
Jan 22 17:05:43 np0005592767 python3.9[154087]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:05:43 np0005592767 systemd[1]: Reloading.
Jan 22 17:05:43 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:05:43 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:05:44 np0005592767 systemd[1]: Starting libvirt secret daemon socket...
Jan 22 17:05:44 np0005592767 systemd[1]: Listening on libvirt secret daemon socket.
Jan 22 17:05:44 np0005592767 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 22 17:05:44 np0005592767 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 22 17:05:44 np0005592767 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 22 17:05:44 np0005592767 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 22 17:05:44 np0005592767 systemd[1]: Starting libvirt secret daemon...
Jan 22 17:05:44 np0005592767 systemd[1]: Started libvirt secret daemon.
Jan 22 17:05:45 np0005592767 podman[154172]: 2026-01-22 22:05:45.144594751 +0000 UTC m=+0.064918823 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:05:45 np0005592767 python3.9[154318]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:46 np0005592767 python3.9[154470]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 17:05:47 np0005592767 python3.9[154622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:48 np0005592767 python3.9[154745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119547.2187953-3348-235771802667706/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:49 np0005592767 python3.9[154897]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:49 np0005592767 python3.9[155049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:50 np0005592767 python3.9[155127]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:51 np0005592767 python3.9[155279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:51 np0005592767 python3.9[155357]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xp0c4lvj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:52 np0005592767 python3.9[155509]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:52 np0005592767 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 22 17:05:52 np0005592767 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 22 17:05:52 np0005592767 python3.9[155587]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:53 np0005592767 python3.9[155739]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:05:54 np0005592767 python3[155892]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 22 17:05:55 np0005592767 python3.9[156044]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:55 np0005592767 python3.9[156122]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:56 np0005592767 python3.9[156274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:57 np0005592767 python3.9[156399]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119555.91443-3615-47969651063897/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:57 np0005592767 python3.9[156551]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:58 np0005592767 python3.9[156629]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:05:58 np0005592767 python3.9[156781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:05:59 np0005592767 python3.9[156859]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:00 np0005592767 python3.9[157011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:06:01 np0005592767 python3.9[157136]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769119560.0486536-3731-34302882943563/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:01 np0005592767 python3.9[157288]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:02 np0005592767 python3.9[157440]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:06:03 np0005592767 python3.9[157595]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:04 np0005592767 python3.9[157747]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:06:05 np0005592767 python3.9[157900]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:06:06 np0005592767 python3.9[158054]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:06:06 np0005592767 python3.9[158209]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:07 np0005592767 python3.9[158361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:06:08 np0005592767 python3.9[158484]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119567.0779498-3947-53718377107990/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:08 np0005592767 python3.9[158636]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:06:09 np0005592767 python3.9[158759]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119568.521426-3992-53344692464668/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:10 np0005592767 podman[158807]: 2026-01-22 22:06:10.186796738 +0000 UTC m=+0.104597551 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:06:10 np0005592767 python3.9[158937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:06:11 np0005592767 python3.9[159060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119570.0526524-4037-171651971991465/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:06:12.074 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:06:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:06:12.075 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:06:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:06:12.075 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:06:12 np0005592767 python3.9[159212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:06:12 np0005592767 systemd[1]: Reloading.
Jan 22 17:06:12 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:06:12 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:06:12 np0005592767 systemd[1]: Reached target edpm_libvirt.target.
Jan 22 17:06:13 np0005592767 python3.9[159402]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 22 17:06:13 np0005592767 systemd[1]: Reloading.
Jan 22 17:06:13 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:06:13 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:06:13 np0005592767 systemd[1]: Reloading.
Jan 22 17:06:13 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:06:13 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:06:14 np0005592767 systemd[1]: session-23.scope: Deactivated successfully.
Jan 22 17:06:14 np0005592767 systemd[1]: session-23.scope: Consumed 3min 18.269s CPU time.
Jan 22 17:06:14 np0005592767 systemd-logind[802]: Session 23 logged out. Waiting for processes to exit.
Jan 22 17:06:14 np0005592767 systemd-logind[802]: Removed session 23.
Jan 22 17:06:16 np0005592767 podman[159498]: 2026-01-22 22:06:16.137316947 +0000 UTC m=+0.056345336 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 22 17:06:19 np0005592767 systemd-logind[802]: New session 24 of user zuul.
Jan 22 17:06:19 np0005592767 systemd[1]: Started Session 24 of User zuul.
Jan 22 17:06:20 np0005592767 python3.9[159674]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:06:21 np0005592767 python3.9[159828]: ansible-ansible.builtin.service_facts Invoked
Jan 22 17:06:22 np0005592767 network[159845]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 17:06:22 np0005592767 network[159846]: 'network-scripts' will be removed from distribution in near future.
Jan 22 17:06:22 np0005592767 network[159847]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 17:06:28 np0005592767 python3.9[160118]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 22 17:06:29 np0005592767 python3.9[160202]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 17:06:36 np0005592767 python3.9[160355]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:06:37 np0005592767 python3.9[160507]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:06:38 np0005592767 python3.9[160660]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:06:38 np0005592767 python3.9[160812]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:06:40 np0005592767 podman[160937]: 2026-01-22 22:06:40.42500503 +0000 UTC m=+0.104120218 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:06:40 np0005592767 python3.9[160980]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:06:41 np0005592767 python3.9[161115]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119600.101459-248-69513180383546/.source.iscsi _original_basename=.3tcb1qyp follow=False checksum=f86d1b7218052bec8b10b7ea8ed1196dd04f3141 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:42 np0005592767 python3.9[161267]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:43 np0005592767 python3.9[161419]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:06:44 np0005592767 python3.9[161571]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:06:44 np0005592767 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 22 17:06:46 np0005592767 python3.9[161727]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:06:46 np0005592767 systemd[1]: Reloading.
Jan 22 17:06:46 np0005592767 podman[161729]: 2026-01-22 22:06:46.356759834 +0000 UTC m=+0.058175918 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:06:46 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:06:46 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:06:46 np0005592767 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 17:06:46 np0005592767 systemd[1]: Starting Open-iSCSI...
Jan 22 17:06:46 np0005592767 kernel: Loading iSCSI transport class v2.0-870.
Jan 22 17:06:46 np0005592767 systemd[1]: Started Open-iSCSI.
Jan 22 17:06:46 np0005592767 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 22 17:06:46 np0005592767 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 22 17:06:47 np0005592767 python3.9[161944]: ansible-ansible.builtin.service_facts Invoked
Jan 22 17:06:47 np0005592767 network[161961]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 17:06:47 np0005592767 network[161962]: 'network-scripts' will be removed from distribution in near future.
Jan 22 17:06:47 np0005592767 network[161963]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 17:06:54 np0005592767 python3.9[162234]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 17:06:56 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 17:06:56 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 17:06:56 np0005592767 systemd[1]: Reloading.
Jan 22 17:06:56 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:06:56 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:06:56 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 17:06:59 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 17:06:59 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 17:06:59 np0005592767 systemd[1]: run-r4ec56ba48b684440815f0225167719a5.service: Deactivated successfully.
Jan 22 17:07:00 np0005592767 python3.9[162553]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 17:07:01 np0005592767 python3.9[162705]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 22 17:07:01 np0005592767 python3.9[162861]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:07:02 np0005592767 python3.9[162984]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119621.4055462-512-33678355485236/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:03 np0005592767 python3.9[163136]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:04 np0005592767 python3.9[163288]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:07:04 np0005592767 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 17:07:04 np0005592767 systemd[1]: Stopped Load Kernel Modules.
Jan 22 17:07:04 np0005592767 systemd[1]: Stopping Load Kernel Modules...
Jan 22 17:07:04 np0005592767 systemd[1]: Starting Load Kernel Modules...
Jan 22 17:07:04 np0005592767 systemd[1]: Finished Load Kernel Modules.
Jan 22 17:07:05 np0005592767 python3.9[163444]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:07:06 np0005592767 python3.9[163597]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:07:06 np0005592767 python3.9[163749]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:07:07 np0005592767 python3.9[163872]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119626.4110436-665-207692240074983/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:08 np0005592767 python3.9[164024]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:07:09 np0005592767 python3.9[164177]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:10 np0005592767 python3.9[164329]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:10 np0005592767 podman[164453]: 2026-01-22 22:07:10.860102642 +0000 UTC m=+0.126420280 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 17:07:10 np0005592767 python3.9[164494]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:11 np0005592767 python3.9[164659]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:07:12.075 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:07:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:07:12.075 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:07:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:07:12.076 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:07:12 np0005592767 python3.9[164811]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:13 np0005592767 python3.9[164963]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:13 np0005592767 python3.9[165115]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:14 np0005592767 python3.9[165267]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:07:15 np0005592767 python3.9[165421]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:07:16 np0005592767 python3.9[165574]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:16 np0005592767 systemd[1]: Listening on multipathd control socket.
Jan 22 17:07:16 np0005592767 podman[165579]: 2026-01-22 22:07:16.669014561 +0000 UTC m=+0.057429245 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:07:17 np0005592767 python3.9[165750]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:17 np0005592767 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 22 17:07:17 np0005592767 udevadm[165755]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 22 17:07:17 np0005592767 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 22 17:07:17 np0005592767 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 17:07:17 np0005592767 multipathd[165758]: --------start up--------
Jan 22 17:07:17 np0005592767 multipathd[165758]: read /etc/multipath.conf
Jan 22 17:07:17 np0005592767 multipathd[165758]: path checkers start up
Jan 22 17:07:17 np0005592767 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 17:07:18 np0005592767 python3.9[165917]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 22 17:07:19 np0005592767 python3.9[166069]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 22 17:07:19 np0005592767 kernel: Key type psk registered
Jan 22 17:07:20 np0005592767 python3.9[166231]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:07:20 np0005592767 python3.9[166354]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119640.0200977-1055-42019533640522/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:22 np0005592767 python3.9[166506]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:22 np0005592767 python3.9[166658]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:07:22 np0005592767 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 22 17:07:22 np0005592767 systemd[1]: Stopped Load Kernel Modules.
Jan 22 17:07:22 np0005592767 systemd[1]: Stopping Load Kernel Modules...
Jan 22 17:07:22 np0005592767 systemd[1]: Starting Load Kernel Modules...
Jan 22 17:07:22 np0005592767 systemd[1]: Finished Load Kernel Modules.
Jan 22 17:07:23 np0005592767 python3.9[166814]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 22 17:07:26 np0005592767 systemd[1]: Reloading.
Jan 22 17:07:26 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:07:26 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:07:26 np0005592767 systemd[1]: Reloading.
Jan 22 17:07:26 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:07:26 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:07:27 np0005592767 systemd-logind[802]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 22 17:07:27 np0005592767 systemd-logind[802]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 22 17:07:27 np0005592767 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 22 17:07:27 np0005592767 systemd[1]: Starting man-db-cache-update.service...
Jan 22 17:07:27 np0005592767 systemd[1]: Reloading.
Jan 22 17:07:27 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:07:27 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:07:27 np0005592767 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 22 17:07:28 np0005592767 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 22 17:07:28 np0005592767 systemd[1]: Finished man-db-cache-update.service.
Jan 22 17:07:28 np0005592767 systemd[1]: man-db-cache-update.service: Consumed 1.334s CPU time.
Jan 22 17:07:28 np0005592767 systemd[1]: run-r049c0b3900cc43369a80f7b096a55468.service: Deactivated successfully.
Jan 22 17:07:29 np0005592767 python3.9[168278]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:07:29 np0005592767 systemd[1]: Stopping Open-iSCSI...
Jan 22 17:07:29 np0005592767 iscsid[161785]: iscsid shutting down.
Jan 22 17:07:29 np0005592767 systemd[1]: iscsid.service: Deactivated successfully.
Jan 22 17:07:29 np0005592767 systemd[1]: Stopped Open-iSCSI.
Jan 22 17:07:29 np0005592767 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 22 17:07:29 np0005592767 systemd[1]: Starting Open-iSCSI...
Jan 22 17:07:29 np0005592767 systemd[1]: Started Open-iSCSI.
Jan 22 17:07:30 np0005592767 python3.9[168434]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:07:30 np0005592767 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 22 17:07:30 np0005592767 multipathd[165758]: exit (signal)
Jan 22 17:07:30 np0005592767 multipathd[165758]: --------shut down-------
Jan 22 17:07:30 np0005592767 systemd[1]: multipathd.service: Deactivated successfully.
Jan 22 17:07:30 np0005592767 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 22 17:07:30 np0005592767 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 22 17:07:30 np0005592767 multipathd[168440]: --------start up--------
Jan 22 17:07:30 np0005592767 multipathd[168440]: read /etc/multipath.conf
Jan 22 17:07:30 np0005592767 multipathd[168440]: path checkers start up
Jan 22 17:07:30 np0005592767 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 22 17:07:31 np0005592767 python3.9[168597]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:07:32 np0005592767 python3.9[168753]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:33 np0005592767 python3.9[168905]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:07:33 np0005592767 systemd[1]: Reloading.
Jan 22 17:07:33 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:07:33 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:07:34 np0005592767 python3.9[169090]: ansible-ansible.builtin.service_facts Invoked
Jan 22 17:07:34 np0005592767 network[169107]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 17:07:34 np0005592767 network[169108]: 'network-scripts' will be removed from distribution in near future.
Jan 22 17:07:34 np0005592767 network[169109]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 17:07:39 np0005592767 python3.9[169381]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:39 np0005592767 python3.9[169534]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:40 np0005592767 python3.9[169687]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:41 np0005592767 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 22 17:07:41 np0005592767 podman[169812]: 2026-01-22 22:07:41.076358323 +0000 UTC m=+0.100748319 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:07:41 np0005592767 python3.9[169859]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:42 np0005592767 python3.9[170022]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:42 np0005592767 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 17:07:42 np0005592767 python3.9[170176]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:43 np0005592767 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 22 17:07:43 np0005592767 python3.9[170329]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:44 np0005592767 python3.9[170483]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:07:44 np0005592767 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 22 17:07:45 np0005592767 python3.9[170637]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:46 np0005592767 python3.9[170789]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:46 np0005592767 podman[170941]: 2026-01-22 22:07:46.825623866 +0000 UTC m=+0.083184509 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:07:46 np0005592767 python3.9[170942]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:47 np0005592767 python3.9[171112]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:48 np0005592767 python3.9[171264]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:48 np0005592767 python3.9[171416]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:49 np0005592767 python3.9[171568]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:49 np0005592767 python3.9[171720]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:51 np0005592767 python3.9[171872]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:51 np0005592767 python3.9[172024]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:52 np0005592767 python3.9[172176]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:52 np0005592767 python3.9[172328]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:53 np0005592767 python3.9[172480]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:54 np0005592767 python3.9[172632]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:54 np0005592767 python3.9[172784]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:55 np0005592767 python3.9[172936]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:07:56 np0005592767 python3.9[173088]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:07:57 np0005592767 python3.9[173240]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 17:07:58 np0005592767 python3.9[173392]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:07:58 np0005592767 systemd[1]: Reloading.
Jan 22 17:07:58 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:07:58 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:07:59 np0005592767 python3.9[173578]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:00 np0005592767 python3.9[173731]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:00 np0005592767 python3.9[173884]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:01 np0005592767 python3.9[174037]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:02 np0005592767 python3.9[174190]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:02 np0005592767 python3.9[174343]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:03 np0005592767 python3.9[174496]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:03 np0005592767 python3.9[174649]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:08:06 np0005592767 python3.9[174802]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:07 np0005592767 python3.9[174954]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:08 np0005592767 python3.9[175106]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:08 np0005592767 python3.9[175258]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:09 np0005592767 python3.9[175410]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:10 np0005592767 python3.9[175562]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:10 np0005592767 python3.9[175714]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:11 np0005592767 podman[175866]: 2026-01-22 22:08:11.245893001 +0000 UTC m=+0.088142665 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Jan 22 17:08:11 np0005592767 python3.9[175867]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:11 np0005592767 python3.9[176044]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:08:12.076 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:08:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:08:12.077 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:08:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:08:12.077 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:08:12 np0005592767 python3.9[176196]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:17 np0005592767 podman[176221]: 2026-01-22 22:08:17.168731254 +0000 UTC m=+0.078746478 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:08:18 np0005592767 python3.9[176368]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 22 17:08:19 np0005592767 python3.9[176521]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 17:08:20 np0005592767 python3.9[176679]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 17:08:22 np0005592767 systemd-logind[802]: New session 25 of user zuul.
Jan 22 17:08:22 np0005592767 systemd[1]: Started Session 25 of User zuul.
Jan 22 17:08:22 np0005592767 systemd[1]: session-25.scope: Deactivated successfully.
Jan 22 17:08:22 np0005592767 systemd-logind[802]: Session 25 logged out. Waiting for processes to exit.
Jan 22 17:08:22 np0005592767 systemd-logind[802]: Removed session 25.
Jan 22 17:08:23 np0005592767 python3.9[176865]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:23 np0005592767 python3.9[176986]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119702.469884-2641-44387594316179/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:24 np0005592767 python3.9[177136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:24 np0005592767 python3.9[177212]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:25 np0005592767 python3.9[177362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:25 np0005592767 python3.9[177483]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119704.7237697-2641-57344198156013/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:26 np0005592767 python3.9[177633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:26 np0005592767 python3.9[177754]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119705.780852-2641-95066713852482/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:27 np0005592767 python3.9[177904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:28 np0005592767 python3.9[178025]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119706.8652983-2641-6498842119959/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:28 np0005592767 python3.9[178175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:29 np0005592767 python3.9[178296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119708.227537-2641-27896902782681/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:30 np0005592767 python3.9[178448]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:08:31 np0005592767 python3.9[178600]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:08:32 np0005592767 python3.9[178752]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:33 np0005592767 python3.9[178904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:33 np0005592767 python3.9[179027]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769119712.6128418-2964-148758939854294/.source _original_basename=.5yz1rwm4 follow=False checksum=ffe97dc6acc49ec12c5a89603f5086ba9e785c37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 22 17:08:34 np0005592767 python3.9[179179]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:35 np0005592767 python3.9[179331]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:36 np0005592767 python3.9[179452]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119715.0902889-3041-119005707576050/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:36 np0005592767 python3.9[179602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:08:37 np0005592767 python3.9[179723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119716.3424363-3086-174900892323365/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:08:38 np0005592767 python3.9[179875]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 22 17:08:39 np0005592767 python3.9[180027]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 17:08:41 np0005592767 python3[180179]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 17:08:41 np0005592767 podman[180215]: 2026-01-22 22:08:41.214593974 +0000 UTC m=+0.048378415 container create 7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true)
Jan 22 17:08:41 np0005592767 podman[180215]: 2026-01-22 22:08:41.18627136 +0000 UTC m=+0.020055801 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 17:08:41 np0005592767 python3[180179]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 22 17:08:42 np0005592767 podman[180377]: 2026-01-22 22:08:42.041983955 +0000 UTC m=+0.109986515 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:08:42 np0005592767 python3.9[180422]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:43 np0005592767 python3.9[180586]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 22 17:08:44 np0005592767 python3.9[180738]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 17:08:45 np0005592767 python3[180890]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 17:08:45 np0005592767 podman[180925]: 2026-01-22 22:08:45.641743963 +0000 UTC m=+0.046881602 container create f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute)
Jan 22 17:08:45 np0005592767 podman[180925]: 2026-01-22 22:08:45.615368034 +0000 UTC m=+0.020505723 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 22 17:08:45 np0005592767 python3[180890]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 22 17:08:46 np0005592767 python3.9[181115]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:47 np0005592767 podman[181241]: 2026-01-22 22:08:47.558983041 +0000 UTC m=+0.046620085 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 17:08:47 np0005592767 python3.9[181288]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:08:48 np0005592767 python3.9[181439]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119727.819239-3373-220071530833310/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:08:49 np0005592767 python3.9[181515]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:08:49 np0005592767 systemd[1]: Reloading.
Jan 22 17:08:49 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:08:49 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:08:49 np0005592767 python3.9[181626]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:08:49 np0005592767 systemd[1]: Reloading.
Jan 22 17:08:49 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:08:49 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:08:50 np0005592767 systemd[1]: Starting nova_compute container...
Jan 22 17:08:50 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:08:50 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:50 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:50 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:50 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:50 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:50 np0005592767 podman[181666]: 2026-01-22 22:08:50.279282998 +0000 UTC m=+0.096699927 container init f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:08:50 np0005592767 podman[181666]: 2026-01-22 22:08:50.294088079 +0000 UTC m=+0.111504998 container start f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:08:50 np0005592767 podman[181666]: nova_compute
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + sudo -E kolla_set_configs
Jan 22 17:08:50 np0005592767 systemd[1]: Started nova_compute container.
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Validating config file
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying service configuration files
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Deleting /etc/ceph
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Creating directory /etc/ceph
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Writing out command to execute
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:50 np0005592767 nova_compute[181681]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 17:08:50 np0005592767 nova_compute[181681]: ++ cat /run_command
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + CMD=nova-compute
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + ARGS=
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + sudo kolla_copy_cacerts
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + [[ ! -n '' ]]
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + . kolla_extend_start
Jan 22 17:08:50 np0005592767 nova_compute[181681]: Running command: 'nova-compute'
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + umask 0022
Jan 22 17:08:50 np0005592767 nova_compute[181681]: + exec nova-compute
Jan 22 17:08:52 np0005592767 python3.9[181843]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.376 181685 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.377 181685 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.377 181685 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.377 181685 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.522 181685 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.534 181685 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:08:52 np0005592767 nova_compute[181681]: 2026-01-22 22:08:52.535 181685 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 22 17:08:52 np0005592767 python3.9[181997]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.103 181685 INFO nova.virt.driver [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.206 181685 INFO nova.compute.provider_config [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.225 181685 DEBUG oslo_concurrency.lockutils [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.225 181685 DEBUG oslo_concurrency.lockutils [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.225 181685 DEBUG oslo_concurrency.lockutils [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.226 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.226 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.226 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.226 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.226 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.227 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.228 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.228 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.228 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.228 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.228 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.228 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.229 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.230 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.231 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.232 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.232 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.232 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.232 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.232 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.232 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.233 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.234 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.235 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.236 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.237 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.238 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.239 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.240 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.241 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.242 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.243 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.244 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.245 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.246 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.247 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.248 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.249 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.250 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.250 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.250 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.250 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.250 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.250 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.251 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.252 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.253 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.254 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.255 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.256 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.257 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.258 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.259 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.259 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.259 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.259 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.259 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.259 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.260 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.261 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.262 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.263 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.263 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.263 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.263 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.263 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.263 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.264 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.265 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.266 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.267 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.268 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.268 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.268 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.268 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.268 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.269 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.270 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.271 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.272 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.273 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.274 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.274 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.274 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.274 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.274 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.274 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.275 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.275 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.275 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.275 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.275 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.275 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.276 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.276 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.276 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.276 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.276 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.276 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.277 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.277 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.277 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.277 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.277 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.278 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.279 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.279 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.279 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.279 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.279 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.279 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.280 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.280 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.280 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.280 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.280 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.280 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.281 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.282 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.282 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.282 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.282 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.282 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.282 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.283 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.284 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.285 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.286 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.287 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.288 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.288 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.288 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.288 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.288 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.289 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.290 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.290 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.290 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.290 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.290 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.290 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.291 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.291 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.291 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.291 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.291 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.292 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.292 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.292 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.292 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.292 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.293 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.293 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.293 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.293 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.293 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.293 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.294 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.295 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.296 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.297 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.298 181685 WARNING oslo_config.cfg [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 17:08:53 np0005592767 nova_compute[181681]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 17:08:53 np0005592767 nova_compute[181681]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 17:08:53 np0005592767 nova_compute[181681]: and ``live_migration_inbound_addr`` respectively.
Jan 22 17:08:53 np0005592767 nova_compute[181681]: ).  Its value may be silently ignored in the future.#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.298 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.298 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.298 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.298 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.298 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.299 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.300 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.301 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.302 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.303 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.304 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.305 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.306 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.307 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.308 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.309 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.310 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.311 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.312 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.313 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.314 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.315 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.315 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.315 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.315 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.315 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.315 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.316 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.317 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.317 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.317 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.317 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.317 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.317 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.318 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.318 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.318 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.318 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.318 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.319 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.320 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.321 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.322 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.323 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.324 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.325 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.326 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.326 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.326 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.326 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.326 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.326 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.327 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.328 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.329 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.330 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.331 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.332 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.333 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.333 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.333 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.333 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.333 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.333 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.334 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.335 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.336 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.337 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.338 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.339 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.339 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.339 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.339 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.339 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.339 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.340 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.341 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.342 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.342 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.342 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.342 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.342 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.342 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.343 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.344 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.345 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.346 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.346 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.346 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.346 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.346 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.346 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.347 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.347 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.347 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.347 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.347 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.347 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.348 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.349 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.350 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.351 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.352 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.353 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.354 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.354 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.354 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.354 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.354 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.354 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.355 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.356 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.357 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.358 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.359 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.359 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.359 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.359 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.359 181685 DEBUG oslo_service.service [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.360 181685 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.373 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.374 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.374 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.374 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 22 17:08:53 np0005592767 systemd[1]: Starting libvirt QEMU daemon...
Jan 22 17:08:53 np0005592767 systemd[1]: Started libvirt QEMU daemon.
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.455 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1cca96fa60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.460 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1cca96fa60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.461 181685 INFO nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.482 181685 WARNING nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 22 17:08:53 np0005592767 nova_compute[181681]: 2026-01-22 22:08:53.482 181685 DEBUG nova.virt.libvirt.volume.mount [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 22 17:08:53 np0005592767 python3.9[182199]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.282 181685 INFO nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <host>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <uuid>094772d4-6a6e-4838-98c5-520e3f85ea8a</uuid>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <arch>x86_64</arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model>EPYC-Rome-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <vendor>AMD</vendor>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <microcode version='16777317'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <signature family='23' model='49' stepping='0'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='x2apic'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='tsc-deadline'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='osxsave'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='hypervisor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='tsc_adjust'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='spec-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='stibp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='arch-capabilities'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='cmp_legacy'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='topoext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='virt-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='lbrv'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='tsc-scale'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='vmcb-clean'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='pause-filter'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='pfthreshold'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='svme-addr-chk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='rdctl-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='skip-l1dfl-vmentry'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='mds-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature name='pschange-mc-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <pages unit='KiB' size='4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <pages unit='KiB' size='2048'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <pages unit='KiB' size='1048576'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <power_management>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <suspend_mem/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <suspend_disk/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <suspend_hybrid/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </power_management>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <iommu support='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <migration_features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <live/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <uri_transports>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <uri_transport>tcp</uri_transport>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <uri_transport>rdma</uri_transport>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </uri_transports>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </migration_features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <topology>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <cells num='1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <cell id='0'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          <memory unit='KiB'>7864304</memory>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          <pages unit='KiB' size='4'>1966076</pages>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          <pages unit='KiB' size='2048'>0</pages>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          <distances>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <sibling id='0' value='10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          </distances>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          <cpus num='8'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:          </cpus>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        </cell>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </cells>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </topology>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <cache>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </cache>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <secmodel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model>selinux</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <doi>0</doi>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </secmodel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <secmodel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model>dac</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <doi>0</doi>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </secmodel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </host>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <guest>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <os_type>hvm</os_type>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <arch name='i686'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <wordsize>32</wordsize>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <domain type='qemu'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <domain type='kvm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <pae/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <nonpae/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <acpi default='on' toggle='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <apic default='on' toggle='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <cpuselection/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <deviceboot/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <disksnapshot default='on' toggle='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <externalSnapshot/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </guest>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <guest>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <os_type>hvm</os_type>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <arch name='x86_64'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <wordsize>64</wordsize>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <domain type='qemu'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <domain type='kvm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <acpi default='on' toggle='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <apic default='on' toggle='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <cpuselection/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <deviceboot/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <disksnapshot default='on' toggle='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <externalSnapshot/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </guest>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 
Jan 22 17:08:54 np0005592767 nova_compute[181681]: </capabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.291 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.312 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 17:08:54 np0005592767 nova_compute[181681]: <domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <domain>kvm</domain>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <arch>i686</arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <vcpu max='4096'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <iothreads supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <os supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='firmware'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <loader supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>rom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pflash</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='readonly'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>yes</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='secure'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </loader>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </os>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='maximumMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <vendor>AMD</vendor>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='succor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='custom' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <memoryBacking supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='sourceType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>anonymous</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>memfd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </memoryBacking>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <disk supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='diskDevice'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>disk</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cdrom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>floppy</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>lun</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>fdc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>sata</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </disk>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <graphics supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vnc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egl-headless</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </graphics>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <video supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='modelType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vga</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cirrus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>none</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>bochs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ramfb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </video>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hostdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='mode'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>subsystem</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='startupPolicy'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>mandatory</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>requisite</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>optional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='subsysType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pci</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='capsType'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='pciBackend'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hostdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <rng supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>random</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </rng>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <filesystem supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='driverType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>path</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>handle</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtiofs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </filesystem>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tpm supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-tis</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-crb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emulator</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>external</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendVersion'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>2.0</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </tpm>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <redirdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </redirdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <channel supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </channel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <crypto supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </crypto>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <interface supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>passt</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </interface>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <panic supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>isa</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>hyperv</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </panic>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <console supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>null</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dev</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pipe</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stdio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>udp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tcp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu-vdagent</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </console>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <gic supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <genid supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backup supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <async-teardown supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <s390-pv supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <ps2 supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tdx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sev supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sgx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hyperv supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='features'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>relaxed</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vapic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>spinlocks</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vpindex</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>runtime</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>synic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stimer</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reset</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vendor_id</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>frequencies</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reenlightenment</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tlbflush</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ipi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>avic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emsr_bitmap</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>xmm_input</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hyperv>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <launchSecurity supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: </domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.321 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 17:08:54 np0005592767 nova_compute[181681]: <domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <domain>kvm</domain>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <arch>i686</arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <vcpu max='240'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <iothreads supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <os supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='firmware'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <loader supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>rom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pflash</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='readonly'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>yes</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='secure'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </loader>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </os>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='maximumMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <vendor>AMD</vendor>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='succor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='custom' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <memoryBacking supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='sourceType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>anonymous</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>memfd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </memoryBacking>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <disk supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='diskDevice'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>disk</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cdrom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>floppy</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>lun</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ide</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>fdc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>sata</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </disk>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <graphics supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vnc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egl-headless</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </graphics>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <video supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='modelType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vga</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cirrus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>none</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>bochs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ramfb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </video>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hostdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='mode'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>subsystem</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='startupPolicy'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>mandatory</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>requisite</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>optional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='subsysType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pci</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='capsType'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='pciBackend'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hostdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <rng supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>random</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </rng>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <filesystem supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='driverType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>path</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>handle</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtiofs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </filesystem>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tpm supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-tis</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-crb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emulator</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>external</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendVersion'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>2.0</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </tpm>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <redirdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </redirdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <channel supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </channel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <crypto supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </crypto>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <interface supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>passt</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </interface>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <panic supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>isa</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>hyperv</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </panic>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <console supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>null</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dev</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pipe</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stdio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>udp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tcp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu-vdagent</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </console>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <gic supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <genid supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backup supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <async-teardown supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <s390-pv supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <ps2 supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tdx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sev supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sgx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hyperv supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='features'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>relaxed</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vapic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>spinlocks</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vpindex</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>runtime</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>synic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stimer</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reset</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vendor_id</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>frequencies</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reenlightenment</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tlbflush</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ipi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>avic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emsr_bitmap</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>xmm_input</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hyperv>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <launchSecurity supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: </domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.405 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.410 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 17:08:54 np0005592767 nova_compute[181681]: <domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <domain>kvm</domain>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <arch>x86_64</arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <vcpu max='4096'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <iothreads supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <os supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='firmware'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>efi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <loader supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>rom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pflash</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='readonly'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>yes</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='secure'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>yes</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </loader>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </os>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='maximumMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <vendor>AMD</vendor>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='succor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='custom' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <memoryBacking supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='sourceType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>anonymous</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>memfd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </memoryBacking>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <disk supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='diskDevice'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>disk</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cdrom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>floppy</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>lun</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>fdc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>sata</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </disk>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <graphics supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vnc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egl-headless</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </graphics>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <video supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='modelType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vga</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cirrus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>none</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>bochs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ramfb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </video>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hostdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='mode'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>subsystem</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='startupPolicy'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>mandatory</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>requisite</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>optional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='subsysType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pci</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='capsType'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='pciBackend'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hostdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <rng supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>random</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </rng>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <filesystem supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='driverType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>path</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>handle</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtiofs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </filesystem>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tpm supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-tis</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-crb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emulator</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>external</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendVersion'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>2.0</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </tpm>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <redirdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </redirdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <channel supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </channel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <crypto supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </crypto>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <interface supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>passt</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </interface>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <panic supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>isa</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>hyperv</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </panic>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <console supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>null</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dev</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pipe</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stdio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>udp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tcp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu-vdagent</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </console>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <gic supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <genid supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backup supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <async-teardown supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <s390-pv supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <ps2 supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tdx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sev supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sgx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hyperv supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='features'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>relaxed</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vapic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>spinlocks</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vpindex</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>runtime</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>synic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stimer</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reset</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vendor_id</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>frequencies</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reenlightenment</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tlbflush</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ipi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>avic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emsr_bitmap</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>xmm_input</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hyperv>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <launchSecurity supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: </domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.486 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 17:08:54 np0005592767 nova_compute[181681]: <domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <domain>kvm</domain>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <arch>x86_64</arch>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <vcpu max='240'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <iothreads supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <os supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='firmware'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <loader supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>rom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pflash</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='readonly'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>yes</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='secure'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>no</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </loader>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </os>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='maximumMigratable'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>on</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>off</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <vendor>AMD</vendor>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='succor'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <mode name='custom' supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ddpd-u'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sha512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm3'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sm4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Denverton-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amd-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='auto-ibrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='perfmon-v2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbpb'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='stibp-always-on'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='EPYC-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-128'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-256'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx10-512'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='prefetchiti'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Haswell-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512er'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512pf'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fma4'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tbm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xop'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='amx-tile'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-bf16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-fp16'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bitalg'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrc'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fzrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='la57'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='taa-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ifma'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cmpccxadd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fbsdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='fsrs'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ibrs-all'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='intel-psfd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='lam'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mcdt-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pbrsb-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='psdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='serialize'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vaes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='hle'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='rtm'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512bw'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512cd'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512dq'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512f'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='avx512vl'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='invpcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pcid'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='pku'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='mpx'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='core-capability'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='split-lock-detect'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='cldemote'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='erms'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='gfni'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdir64b'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='movdiri'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='xsaves'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='athlon-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='core2duo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='coreduo-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='n270-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='ss'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <blockers model='phenom-v1'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnow'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <feature name='3dnowext'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </blockers>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </mode>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <memoryBacking supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <enum name='sourceType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>anonymous</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <value>memfd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </memoryBacking>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <disk supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='diskDevice'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>disk</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cdrom</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>floppy</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>lun</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ide</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>fdc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>sata</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </disk>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <graphics supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vnc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egl-headless</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </graphics>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <video supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='modelType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vga</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>cirrus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>none</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>bochs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ramfb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </video>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hostdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='mode'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>subsystem</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='startupPolicy'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>mandatory</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>requisite</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>optional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='subsysType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pci</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>scsi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='capsType'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='pciBackend'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hostdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <rng supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtio-non-transitional</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>random</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>egd</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </rng>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <filesystem supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='driverType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>path</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>handle</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>virtiofs</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </filesystem>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tpm supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-tis</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tpm-crb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emulator</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>external</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendVersion'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>2.0</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </tpm>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <redirdev supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='bus'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>usb</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </redirdev>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <channel supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </channel>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <crypto supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendModel'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>builtin</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </crypto>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <interface supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='backendType'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>default</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>passt</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </interface>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <panic supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='model'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>isa</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>hyperv</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </panic>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <console supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='type'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>null</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vc</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pty</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dev</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>file</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>pipe</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stdio</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>udp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tcp</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>unix</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>qemu-vdagent</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>dbus</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </console>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </devices>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <gic supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <genid supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <backup supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <async-teardown supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <s390-pv supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <ps2 supported='yes'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <tdx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sev supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <sgx supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <hyperv supported='yes'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <enum name='features'>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>relaxed</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vapic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>spinlocks</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vpindex</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>runtime</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>synic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>stimer</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reset</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>vendor_id</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>frequencies</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>reenlightenment</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>tlbflush</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>ipi</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>avic</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>emsr_bitmap</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <value>xmm_input</value>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </enum>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      <defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:      </defaults>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    </hyperv>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:    <launchSecurity supported='no'/>
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  </features>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: </domainCapabilities>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.569 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.570 181685 INFO nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Secure Boot support detected
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.572 181685 INFO nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.582 181685 DEBUG nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 22 17:08:54 np0005592767 nova_compute[181681]:  <model>Nehalem</model>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: </cpu>
Jan 22 17:08:54 np0005592767 nova_compute[181681]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.584 181685 DEBUG nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.626 181685 INFO nova.virt.node [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Determined node identity 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from /var/lib/nova/compute_id
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.643 181685 WARNING nova.compute.manager [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Compute nodes ['8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.675 181685 INFO nova.compute.manager [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.760 181685 WARNING nova.compute.manager [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.760 181685 DEBUG oslo_concurrency.lockutils [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.761 181685 DEBUG oslo_concurrency.lockutils [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.761 181685 DEBUG oslo_concurrency.lockutils [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:08:54 np0005592767 nova_compute[181681]: 2026-01-22 22:08:54.761 181685 DEBUG nova.compute.resource_tracker [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:08:54 np0005592767 systemd[1]: Starting libvirt nodedev daemon...
Jan 22 17:08:54 np0005592767 systemd[1]: Started libvirt nodedev daemon.
Jan 22 17:08:54 np0005592767 python3.9[182363]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.071 181685 WARNING nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.072 181685 DEBUG nova.compute.resource_tracker [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6192MB free_disk=73.58368301391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.072 181685 DEBUG oslo_concurrency.lockutils [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.072 181685 DEBUG oslo_concurrency.lockutils [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.102 181685 WARNING nova.compute.resource_tracker [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] No compute node record for compute-2.ctlplane.example.com:8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec could not be found.#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.121 181685 INFO nova.compute.resource_tracker [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.209 181685 DEBUG nova.compute.resource_tracker [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.210 181685 DEBUG nova.compute.resource_tracker [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:08:55 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.750 181685 INFO nova.scheduler.client.report [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] [req-cee212cd-1def-4ea4-957d-3fe5ff05c369] Created resource provider record via placement API for resource provider with UUID 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec and name compute-2.ctlplane.example.com.#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.779 181685 DEBUG nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 22 17:08:55 np0005592767 nova_compute[181681]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.780 181685 INFO nova.virt.libvirt.host [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.780 181685 DEBUG nova.compute.provider_tree [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.781 181685 DEBUG nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.782 181685 DEBUG nova.virt.libvirt.driver [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Libvirt baseline CPU <cpu>
Jan 22 17:08:55 np0005592767 nova_compute[181681]:  <arch>x86_64</arch>
Jan 22 17:08:55 np0005592767 nova_compute[181681]:  <model>Nehalem</model>
Jan 22 17:08:55 np0005592767 nova_compute[181681]:  <vendor>AMD</vendor>
Jan 22 17:08:55 np0005592767 nova_compute[181681]:  <topology sockets="8" cores="1" threads="1"/>
Jan 22 17:08:55 np0005592767 nova_compute[181681]: </cpu>
Jan 22 17:08:55 np0005592767 nova_compute[181681]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.844 181685 DEBUG nova.scheduler.client.report [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Updated inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.845 181685 DEBUG nova.compute.provider_tree [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Updating resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.845 181685 DEBUG nova.compute.provider_tree [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:08:55 np0005592767 python3.9[182561]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 22 17:08:55 np0005592767 systemd[1]: Stopping nova_compute container...
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.933 181685 DEBUG nova.compute.provider_tree [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Updating resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.946 181685 DEBUG oslo_concurrency.lockutils [None req-a34f67cb-93eb-4ea4-8ba0-90db62ecda65 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.946 181685 DEBUG oslo_concurrency.lockutils [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.946 181685 DEBUG oslo_concurrency.lockutils [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:08:55 np0005592767 nova_compute[181681]: 2026-01-22 22:08:55.946 181685 DEBUG oslo_concurrency.lockutils [None req-1c255114-a266-4af7-9143-59935e9897b5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:08:56 np0005592767 virtqemud[182095]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 17:08:56 np0005592767 virtqemud[182095]: hostname: compute-2
Jan 22 17:08:56 np0005592767 virtqemud[182095]: End of file while reading data: Input/output error
Jan 22 17:08:56 np0005592767 systemd[1]: libpod-f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604.scope: Deactivated successfully.
Jan 22 17:08:56 np0005592767 systemd[1]: libpod-f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604.scope: Consumed 3.245s CPU time.
Jan 22 17:08:56 np0005592767 podman[182565]: 2026-01-22 22:08:56.375480785 +0000 UTC m=+0.465410621 container died f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 22 17:08:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604-userdata-shm.mount: Deactivated successfully.
Jan 22 17:08:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay-0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5-merged.mount: Deactivated successfully.
Jan 22 17:08:56 np0005592767 podman[182565]: 2026-01-22 22:08:56.428754638 +0000 UTC m=+0.518684454 container cleanup f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:08:56 np0005592767 podman[182565]: nova_compute
Jan 22 17:08:56 np0005592767 podman[182593]: nova_compute
Jan 22 17:08:56 np0005592767 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 22 17:08:56 np0005592767 systemd[1]: Stopped nova_compute container.
Jan 22 17:08:56 np0005592767 systemd[1]: Starting nova_compute container...
Jan 22 17:08:56 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:08:56 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:56 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:56 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:56 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:56 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7561c21cba8294c23b66ded2b934324c95ed798ffeb8059f4fd4e44ba26cb5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:56 np0005592767 podman[182607]: 2026-01-22 22:08:56.602311298 +0000 UTC m=+0.084969135 container init f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute)
Jan 22 17:08:56 np0005592767 podman[182607]: 2026-01-22 22:08:56.607199177 +0000 UTC m=+0.089856994 container start f7b57391d5eb90e02983e7f0d1e55bed6a243d8869db0a6fe9444f1782e49604 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:08:56 np0005592767 podman[182607]: nova_compute
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + sudo -E kolla_set_configs
Jan 22 17:08:56 np0005592767 systemd[1]: Started nova_compute container.
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Validating config file
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying service configuration files
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /etc/ceph
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Creating directory /etc/ceph
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /etc/ceph
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Writing out command to execute
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:56 np0005592767 nova_compute[182623]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 22 17:08:56 np0005592767 nova_compute[182623]: ++ cat /run_command
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + CMD=nova-compute
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + ARGS=
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + sudo kolla_copy_cacerts
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + [[ ! -n '' ]]
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + . kolla_extend_start
Jan 22 17:08:56 np0005592767 nova_compute[182623]: Running command: 'nova-compute'
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + echo 'Running command: '\''nova-compute'\'''
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + umask 0022
Jan 22 17:08:56 np0005592767 nova_compute[182623]: + exec nova-compute
Jan 22 17:08:58 np0005592767 python3.9[182787]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 22 17:08:58 np0005592767 systemd[1]: Started libpod-conmon-7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697.scope.
Jan 22 17:08:58 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:08:58 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22627d0c3b0aae8d47a711e318211cf68bca2c044d21b259740c6922647d0c7a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:58 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22627d0c3b0aae8d47a711e318211cf68bca2c044d21b259740c6922647d0c7a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:58 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22627d0c3b0aae8d47a711e318211cf68bca2c044d21b259740c6922647d0c7a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 22 17:08:58 np0005592767 podman[182814]: 2026-01-22 22:08:58.662208038 +0000 UTC m=+0.136834338 container init 7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:08:58 np0005592767 podman[182814]: 2026-01-22 22:08:58.668547708 +0000 UTC m=+0.143173998 container start 7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=nova_compute_init)
Jan 22 17:08:58 np0005592767 python3.9[182787]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.707 182627 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.708 182627 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.708 182627 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.708 182627 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Applying nova statedir ownership
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 22 17:08:58 np0005592767 nova_compute_init[182838]: INFO:nova_statedir:Nova statedir ownership complete
Jan 22 17:08:58 np0005592767 systemd[1]: libpod-7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697.scope: Deactivated successfully.
Jan 22 17:08:58 np0005592767 podman[182853]: 2026-01-22 22:08:58.778518821 +0000 UTC m=+0.026036900 container died 7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:08:58 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697-userdata-shm.mount: Deactivated successfully.
Jan 22 17:08:58 np0005592767 systemd[1]: var-lib-containers-storage-overlay-22627d0c3b0aae8d47a711e318211cf68bca2c044d21b259740c6922647d0c7a-merged.mount: Deactivated successfully.
Jan 22 17:08:58 np0005592767 podman[182853]: 2026-01-22 22:08:58.82886036 +0000 UTC m=+0.076378409 container cleanup 7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 17:08:58 np0005592767 systemd[1]: libpod-conmon-7d056831e47e58f6dd0d86aefafd8bf29421022eefbbc1375a6bf91428613697.scope: Deactivated successfully.
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.861 182627 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.885 182627 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:08:58 np0005592767 nova_compute[182623]: 2026-01-22 22:08:58.886 182627 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.388 182627 INFO nova.virt.driver [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.502 182627 INFO nova.compute.provider_config [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.528 182627 DEBUG oslo_concurrency.lockutils [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.529 182627 DEBUG oslo_concurrency.lockutils [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.529 182627 DEBUG oslo_concurrency.lockutils [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.530 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.530 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.530 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.530 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.530 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.531 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.531 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.531 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.531 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.531 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.531 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.532 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.532 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.532 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.532 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.532 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.533 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.533 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.533 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.533 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.533 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.533 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.534 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.534 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.534 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.534 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.534 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.535 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.535 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.535 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.535 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.535 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.536 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.536 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.536 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.536 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.536 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.536 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.537 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.537 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.537 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.537 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.538 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.538 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.538 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.538 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.538 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.539 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.539 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.539 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.539 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.539 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.540 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.540 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.540 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.540 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.540 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.541 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.541 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.541 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.541 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.541 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.541 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.542 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.542 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.542 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.542 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.542 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.543 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.543 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.543 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.543 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.543 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.543 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.544 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.544 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.544 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.544 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.544 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.544 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.545 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.545 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.545 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.545 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.545 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.546 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.546 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.546 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.546 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.546 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.547 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.547 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.547 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.547 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.547 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.547 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.548 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.548 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.548 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.548 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.548 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.549 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.549 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.549 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.549 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.549 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.550 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.550 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.550 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.550 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.550 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.551 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.551 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.551 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.551 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.551 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.552 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.552 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.552 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.552 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.552 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.553 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.553 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.553 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.553 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.553 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.553 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.554 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.554 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.554 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.554 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.554 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.555 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.555 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.555 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.555 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.555 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.556 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.556 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.556 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.556 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.556 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.556 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.557 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.557 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.557 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.557 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.558 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.558 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.558 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.558 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.558 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.559 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.559 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.559 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.559 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.559 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.560 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.560 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.560 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.560 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.560 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.561 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.561 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.561 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.561 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.561 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.561 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.562 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.562 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.562 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.562 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.562 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.563 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.563 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.563 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.563 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.563 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.564 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.564 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.564 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.564 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.565 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.565 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.565 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.565 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.565 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.566 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.566 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.566 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.566 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.566 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.567 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.567 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.567 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.567 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.567 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.567 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.568 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.569 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.569 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.569 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.569 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.569 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.569 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.570 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.571 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.572 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.573 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.573 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.573 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.573 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.573 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.573 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.574 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.575 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.576 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.577 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.578 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.579 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.579 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.579 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.579 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.579 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.579 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.580 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.580 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.580 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.580 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.580 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.580 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.581 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.582 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.583 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.583 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.583 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.583 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.583 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.583 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.584 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.585 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.586 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.587 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.588 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.589 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.590 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.590 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.590 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.590 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.591 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.592 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.593 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.594 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.595 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.596 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.597 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.598 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.599 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.600 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.601 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.602 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.603 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.604 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.604 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.604 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.604 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.604 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.604 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.605 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.606 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.607 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.608 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.609 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.610 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.610 182627 WARNING oslo_config.cfg [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 22 17:08:59 np0005592767 nova_compute[182623]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 22 17:08:59 np0005592767 nova_compute[182623]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 22 17:08:59 np0005592767 nova_compute[182623]: and ``live_migration_inbound_addr`` respectively.
Jan 22 17:08:59 np0005592767 nova_compute[182623]: ).  Its value may be silently ignored in the future.
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.610 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.610 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.610 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.610 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.611 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.611 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.611 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.611 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.611 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.611 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.612 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.612 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.612 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.612 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.612 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.612 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.613 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.614 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.614 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.614 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.614 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.614 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.614 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.615 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.615 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.615 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.615 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.615 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.616 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.617 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.618 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.619 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.620 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.621 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.622 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.622 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.622 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.622 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.622 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.622 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.623 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.624 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.625 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.626 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.627 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.628 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.629 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.630 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.630 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.630 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.630 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.630 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.630 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.631 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.632 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.633 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.633 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.633 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.633 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.633 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.633 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.634 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.635 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.636 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.637 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.638 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.639 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.640 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.641 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.642 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.643 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.644 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.645 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.646 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.646 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.646 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.646 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.646 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.647 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.647 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.647 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.647 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.647 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.647 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.648 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.649 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.650 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.651 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.652 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.652 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.652 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.652 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.652 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.652 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.653 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.653 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.653 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.653 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.653 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.653 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.654 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.654 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.654 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.654 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.654 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.654 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.655 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.656 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.657 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.658 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.658 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.658 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.658 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.659 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.659 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.659 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.659 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.660 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.661 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.661 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.661 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.661 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.661 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.662 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.662 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.662 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.662 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.662 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.662 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 systemd[1]: session-24.scope: Deactivated successfully.
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 systemd[1]: session-24.scope: Consumed 1min 30.432s CPU time.
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 systemd-logind[802]: Session 24 logged out. Waiting for processes to exit.
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.663 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.664 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.664 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.664 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 systemd-logind[802]: Removed session 24.
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.665 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.666 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.667 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.667 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.667 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.667 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.667 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.668 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.669 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.669 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.669 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.669 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.669 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.669 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.670 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.670 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.670 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.670 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.670 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.671 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.672 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.673 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.673 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.673 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.673 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.673 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.673 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.674 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.674 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.674 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.674 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.674 182627 DEBUG oslo_service.service [None req-b1f69d36-d7c1-4c0f-b15f-046559939cbd - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.676 182627 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.688 182627 INFO nova.virt.node [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Determined node identity 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from /var/lib/nova/compute_id#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.688 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.689 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.689 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.690 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.701 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f723f3fc340> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.703 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f723f3fc340> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.704 182627 INFO nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.710 182627 INFO nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Libvirt host capabilities <capabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <host>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <uuid>094772d4-6a6e-4838-98c5-520e3f85ea8a</uuid>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <arch>x86_64</arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model>EPYC-Rome-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <vendor>AMD</vendor>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <microcode version='16777317'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <signature family='23' model='49' stepping='0'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='x2apic'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='tsc-deadline'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='osxsave'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='hypervisor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='tsc_adjust'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='spec-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='stibp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='arch-capabilities'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='cmp_legacy'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='topoext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='virt-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='lbrv'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='tsc-scale'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='vmcb-clean'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='pause-filter'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='pfthreshold'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='svme-addr-chk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='rdctl-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='skip-l1dfl-vmentry'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='mds-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature name='pschange-mc-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <pages unit='KiB' size='4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <pages unit='KiB' size='2048'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <pages unit='KiB' size='1048576'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <power_management>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <suspend_mem/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <suspend_disk/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <suspend_hybrid/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </power_management>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <iommu support='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <migration_features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <live/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <uri_transports>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <uri_transport>tcp</uri_transport>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <uri_transport>rdma</uri_transport>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </uri_transports>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </migration_features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <topology>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <cells num='1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <cell id='0'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          <memory unit='KiB'>7864304</memory>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          <pages unit='KiB' size='4'>1966076</pages>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          <pages unit='KiB' size='2048'>0</pages>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          <distances>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <sibling id='0' value='10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          </distances>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          <cpus num='8'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:          </cpus>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        </cell>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </cells>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </topology>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <cache>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </cache>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <secmodel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model>selinux</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <doi>0</doi>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </secmodel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <secmodel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model>dac</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <doi>0</doi>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </secmodel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </host>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <guest>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <os_type>hvm</os_type>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <arch name='i686'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <wordsize>32</wordsize>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <domain type='qemu'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <domain type='kvm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <pae/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <nonpae/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <acpi default='on' toggle='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <apic default='on' toggle='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <cpuselection/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <deviceboot/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <disksnapshot default='on' toggle='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <externalSnapshot/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </guest>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <guest>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <os_type>hvm</os_type>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <arch name='x86_64'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <wordsize>64</wordsize>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <domain type='qemu'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <domain type='kvm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <acpi default='on' toggle='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <apic default='on' toggle='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <cpuselection/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <deviceboot/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <disksnapshot default='on' toggle='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <externalSnapshot/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </guest>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 
Jan 22 17:08:59 np0005592767 nova_compute[182623]: </capabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.719 182627 DEBUG nova.virt.libvirt.volume.mount [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.723 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.727 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 22 17:08:59 np0005592767 nova_compute[182623]: <domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <domain>kvm</domain>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <arch>i686</arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <vcpu max='240'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <iothreads supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <os supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='firmware'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <loader supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>rom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pflash</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='readonly'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>yes</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='secure'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </loader>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='maximumMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <vendor>AMD</vendor>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='succor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='custom' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='athlon'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='athlon-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='core2duo'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='core2duo-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='coreduo'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='coreduo-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='n270'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='n270-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='phenom'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='phenom-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <memoryBacking supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='sourceType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>file</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>anonymous</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>memfd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </memoryBacking>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <disk supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='diskDevice'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>disk</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>cdrom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>floppy</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>lun</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ide</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>fdc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>sata</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <graphics supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vnc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>egl-headless</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <video supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='modelType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vga</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>cirrus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>none</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>bochs</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ramfb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <hostdev supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='mode'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>subsystem</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='startupPolicy'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>mandatory</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>requisite</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>optional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='subsysType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pci</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='capsType'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='pciBackend'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </hostdev>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <rng supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>random</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>egd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <filesystem supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='driverType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>path</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>handle</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtiofs</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </filesystem>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <tpm supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tpm-tis</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tpm-crb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>emulator</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>external</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendVersion'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>2.0</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </tpm>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <redirdev supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </redirdev>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <channel supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </channel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <crypto supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>qemu</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </crypto>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <interface supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>passt</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <panic supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>isa</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>hyperv</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </panic>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <console supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>null</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dev</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>file</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pipe</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>stdio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>udp</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tcp</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>qemu-vdagent</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <gic supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <genid supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <backup supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <async-teardown supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <s390-pv supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <ps2 supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <tdx supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <sev supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <sgx supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <hyperv supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='features'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>relaxed</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vapic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>spinlocks</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vpindex</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>runtime</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>synic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>stimer</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>reset</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vendor_id</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>frequencies</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>reenlightenment</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tlbflush</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ipi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>avic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>emsr_bitmap</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>xmm_input</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <defaults>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </defaults>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </hyperv>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <launchSecurity supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: </domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.736 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 22 17:08:59 np0005592767 nova_compute[182623]: <domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <domain>kvm</domain>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <arch>i686</arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <vcpu max='4096'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <iothreads supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <os supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='firmware'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <loader supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>rom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pflash</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='readonly'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>yes</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='secure'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </loader>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='maximumMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <vendor>AMD</vendor>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='succor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='custom' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='athlon'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='athlon-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='core2duo'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='core2duo-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='coreduo'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='coreduo-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='n270'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='n270-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='phenom'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='phenom-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <memoryBacking supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='sourceType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>file</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>anonymous</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>memfd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </memoryBacking>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <disk supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='diskDevice'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>disk</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>cdrom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>floppy</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>lun</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>fdc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>sata</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <graphics supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vnc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>egl-headless</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <video supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='modelType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vga</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>cirrus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>none</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>bochs</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ramfb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <hostdev supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='mode'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>subsystem</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='startupPolicy'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>mandatory</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>requisite</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>optional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='subsysType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pci</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='capsType'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='pciBackend'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </hostdev>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <rng supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>random</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>egd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <filesystem supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='driverType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>path</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>handle</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtiofs</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </filesystem>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <tpm supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tpm-tis</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tpm-crb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>emulator</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>external</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendVersion'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>2.0</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </tpm>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <redirdev supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </redirdev>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <channel supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </channel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <crypto supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>qemu</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </crypto>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <interface supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>passt</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <panic supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>isa</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>hyperv</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </panic>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <console supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>null</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dev</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>file</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pipe</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>stdio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>udp</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tcp</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>qemu-vdagent</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <gic supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <genid supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <backup supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <async-teardown supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <s390-pv supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <ps2 supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <tdx supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <sev supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <sgx supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <hyperv supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='features'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>relaxed</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vapic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>spinlocks</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vpindex</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>runtime</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>synic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>stimer</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>reset</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vendor_id</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>frequencies</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>reenlightenment</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tlbflush</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ipi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>avic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>emsr_bitmap</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>xmm_input</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <defaults>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </defaults>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </hyperv>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <launchSecurity supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: </domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.800 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.804 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 22 17:08:59 np0005592767 nova_compute[182623]: <domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <domain>kvm</domain>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <arch>x86_64</arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <vcpu max='240'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <iothreads supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <os supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='firmware'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <loader supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>rom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pflash</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='readonly'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>yes</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='secure'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </loader>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='maximumMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <vendor>AMD</vendor>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='succor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='custom' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Dhyana-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='athlon'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='athlon-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='core2duo'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='core2duo-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='coreduo'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='coreduo-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='n270'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='n270-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='phenom'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='phenom-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <memoryBacking supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='sourceType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>file</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>anonymous</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>memfd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </memoryBacking>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <disk supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='diskDevice'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>disk</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>cdrom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>floppy</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>lun</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ide</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>fdc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>sata</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <graphics supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vnc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>egl-headless</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <video supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='modelType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vga</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>cirrus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>none</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>bochs</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ramfb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <hostdev supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='mode'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>subsystem</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='startupPolicy'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>mandatory</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>requisite</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>optional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='subsysType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pci</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='capsType'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='pciBackend'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </hostdev>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <rng supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>random</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>egd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <filesystem supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='driverType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>path</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>handle</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>virtiofs</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </filesystem>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <tpm supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tpm-tis</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tpm-crb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>emulator</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>external</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendVersion'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>2.0</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </tpm>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <redirdev supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </redirdev>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <channel supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </channel>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <crypto supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>qemu</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </crypto>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <interface supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='backendType'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>passt</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <panic supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>isa</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>hyperv</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </panic>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <console supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>null</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vc</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dev</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>file</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pipe</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>stdio</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>udp</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tcp</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>qemu-vdagent</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <gic supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <vmcoreinfo supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <genid supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <backingStoreInput supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <backup supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <async-teardown supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <s390-pv supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <ps2 supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <tdx supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <sev supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <sgx supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <hyperv supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='features'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>relaxed</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vapic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>spinlocks</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vpindex</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>runtime</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>synic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>stimer</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>reset</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>vendor_id</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>frequencies</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>reenlightenment</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>tlbflush</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>ipi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>avic</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>emsr_bitmap</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>xmm_input</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <defaults>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <spinlocks>4095</spinlocks>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <stimer_direct>on</stimer_direct>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </defaults>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </hyperv>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <launchSecurity supported='no'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: </domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 17:08:59 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.892 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 22 17:08:59 np0005592767 nova_compute[182623]: <domainCapabilities>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <path>/usr/libexec/qemu-kvm</path>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <domain>kvm</domain>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <arch>x86_64</arch>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <vcpu max='4096'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <iothreads supported='yes'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <os supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <enum name='firmware'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>efi</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <loader supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>rom</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>pflash</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='readonly'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>yes</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='secure'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>yes</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>no</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </loader>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:  <cpu>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-passthrough' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='hostPassthroughMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='maximum' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <enum name='maximumMigratable'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>on</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <value>off</value>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='host-model' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <vendor>AMD</vendor>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='x2apic'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-deadline'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='hypervisor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc_adjust'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='spec-ctrl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='stibp'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='cmp_legacy'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='overflow-recov'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='succor'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='ibrs'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='amd-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='virt-ssbd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lbrv'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='tsc-scale'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='vmcb-clean'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='flushbyasid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pause-filter'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='pfthreshold'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='svme-addr-chk'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <feature policy='disable' name='xsaves'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:    <mode name='custom' supported='yes'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v2'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v3'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Broadwell-v4'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v1'>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:08:59 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cascadelake-Server-v5'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='ClearwaterForest-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bhi-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ddpd-u'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sha512'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sm3'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sm4'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Cooperlake-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Denverton'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Denverton-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Dhyana-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Genoa-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Milan-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Rome-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-Turin-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amd-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='auto-ibrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vp2intersect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fs-gs-base-ns'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibpb-brtype'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='no-nested-data-bp'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='null-sel-clr-base'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='perfmon-v2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbpb'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='srso-user-kernel-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='stibp-always-on'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='EPYC-v5'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='GraniteRapids-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10-128'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10-256'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx10-512'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='prefetchiti'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Haswell-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-noTSX'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v5'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v6'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Icelake-Server-v7'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='IvyBridge-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='KnightsMill-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-4fmaps'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-4vnniw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512er'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512pf'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G4-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Opteron_G5-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fma4'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tbm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xop'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SapphireRapids-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='amx-tile'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-bf16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-fp16'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512-vpopcntdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bitalg'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vbmi2'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrc'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fzrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='la57'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='taa-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='tsx-ldtrk'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SierraForest'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='SierraForest-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ifma'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-ne-convert'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx-vnni-int8'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bhi-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='bus-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cmpccxadd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fbsdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='fsrs'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ibrs-all'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='intel-psfd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ipred-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='lam'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mcdt-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pbrsb-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='psdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rrsba-ctrl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='sbdr-ssdp-no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='serialize'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vaes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='vpclmulqdq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Client-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='hle'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='rtm'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Skylake-Server-v5'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512bw'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512cd'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512dq'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512f'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='avx512vl'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='invpcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pcid'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='pku'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Snowridge'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='mpx'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v2'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v3'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='core-capability'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='split-lock-detect'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='Snowridge-v4'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='cldemote'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='erms'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='gfni'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdir64b'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='movdiri'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='xsaves'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='athlon'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='athlon-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='core2duo'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='core2duo-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='coreduo'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='coreduo-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='n270'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='n270-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='ss'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='phenom'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <blockers model='phenom-v1'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnow'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <feature name='3dnowext'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </blockers>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </mode>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <memoryBacking supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <enum name='sourceType'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <value>file</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <value>anonymous</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <value>memfd</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  </memoryBacking>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <disk supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='diskDevice'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>disk</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>cdrom</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>floppy</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>lun</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>fdc</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>sata</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <graphics supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>vnc</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>egl-headless</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <video supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='modelType'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>vga</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>cirrus</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>none</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>bochs</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>ramfb</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <hostdev supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='mode'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>subsystem</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='startupPolicy'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>mandatory</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>requisite</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>optional</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='subsysType'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>pci</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>scsi</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='capsType'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='pciBackend'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </hostdev>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <rng supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio-transitional</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtio-non-transitional</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>random</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>egd</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <filesystem supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='driverType'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>path</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>handle</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>virtiofs</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </filesystem>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <tpm supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>tpm-tis</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>tpm-crb</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>emulator</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>external</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='backendVersion'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>2.0</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </tpm>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <redirdev supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='bus'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>usb</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </redirdev>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <channel supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </channel>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <crypto supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='model'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>qemu</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='backendModel'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>builtin</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </crypto>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <interface supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='backendType'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>default</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>passt</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <panic supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='model'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>isa</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>hyperv</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </panic>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <console supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='type'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>null</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>vc</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>pty</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>dev</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>file</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>pipe</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>stdio</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>udp</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>tcp</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>unix</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>qemu-vdagent</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>dbus</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <gic supported='no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <vmcoreinfo supported='yes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <genid supported='yes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <backingStoreInput supported='yes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <backup supported='yes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <async-teardown supported='yes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <s390-pv supported='no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <ps2 supported='yes'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <tdx supported='no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <sev supported='no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <sgx supported='no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <hyperv supported='yes'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <enum name='features'>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>relaxed</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>vapic</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>spinlocks</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>vpindex</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>runtime</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>synic</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>stimer</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>reset</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>vendor_id</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>frequencies</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>reenlightenment</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>tlbflush</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>ipi</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>avic</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>emsr_bitmap</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <value>xmm_input</value>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </enum>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      <defaults>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <spinlocks>4095</spinlocks>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <stimer_direct>on</stimer_direct>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <tlbflush_direct>on</tlbflush_direct>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <tlbflush_extended>on</tlbflush_extended>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:      </defaults>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    </hyperv>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:    <launchSecurity supported='no'/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:09:00 np0005592767 nova_compute[182623]: </domainCapabilities>
Jan 22 17:09:00 np0005592767 nova_compute[182623]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.974 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.980 182627 INFO nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Secure Boot support detected#033[00m
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.982 182627 INFO nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.991 182627 DEBUG nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <model>Nehalem</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]: </cpu>
Jan 22 17:09:00 np0005592767 nova_compute[182623]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:08:59.993 182627 DEBUG nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.009 182627 INFO nova.virt.node [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Determined node identity 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from /var/lib/nova/compute_id
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.023 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Verified node 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.046 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.126 182627 DEBUG oslo_concurrency.lockutils [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.127 182627 DEBUG oslo_concurrency.lockutils [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.127 182627 DEBUG oslo_concurrency.lockutils [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.127 182627 DEBUG nova.compute.resource_tracker [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.264 182627 WARNING nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.265 182627 DEBUG nova.compute.resource_tracker [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6176MB free_disk=73.58271789550781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.265 182627 DEBUG oslo_concurrency.lockutils [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.265 182627 DEBUG oslo_concurrency.lockutils [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.349 182627 DEBUG nova.compute.resource_tracker [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.349 182627 DEBUG nova.compute.resource_tracker [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.400 182627 DEBUG nova.scheduler.client.report [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.420 182627 DEBUG nova.scheduler.client.report [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.420 182627 DEBUG nova.compute.provider_tree [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.466 182627 DEBUG nova.scheduler.client.report [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.496 182627 DEBUG nova.scheduler.client.report [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.519 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 22 17:09:00 np0005592767 nova_compute[182623]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.519 182627 INFO nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] kernel doesn't support AMD SEV
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.520 182627 DEBUG nova.compute.provider_tree [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.520 182627 DEBUG nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.522 182627 DEBUG nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Libvirt baseline CPU <cpu>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <arch>x86_64</arch>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <model>Nehalem</model>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <vendor>AMD</vendor>
Jan 22 17:09:00 np0005592767 nova_compute[182623]:  <topology sockets="8" cores="1" threads="1"/>
Jan 22 17:09:00 np0005592767 nova_compute[182623]: </cpu>
Jan 22 17:09:00 np0005592767 nova_compute[182623]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.545 182627 DEBUG nova.scheduler.client.report [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.563 182627 DEBUG nova.compute.resource_tracker [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.563 182627 DEBUG oslo_concurrency.lockutils [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.563 182627 DEBUG nova.service [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.639 182627 DEBUG nova.service [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 22 17:09:00 np0005592767 nova_compute[182623]: 2026-01-22 22:09:00.639 182627 DEBUG nova.servicegroup.drivers.db [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 22 17:09:04 np0005592767 systemd-logind[802]: New session 26 of user zuul.
Jan 22 17:09:04 np0005592767 systemd[1]: Started Session 26 of User zuul.
Jan 22 17:09:05 np0005592767 python3.9[183076]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 22 17:09:07 np0005592767 python3.9[183232]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:09:07 np0005592767 systemd[1]: Reloading.
Jan 22 17:09:07 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:09:07 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:09:08 np0005592767 python3.9[183418]: ansible-ansible.builtin.service_facts Invoked
Jan 22 17:09:09 np0005592767 network[183435]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 22 17:09:09 np0005592767 network[183436]: 'network-scripts' will be removed from distribution in near future.
Jan 22 17:09:09 np0005592767 network[183437]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 22 17:09:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:09:12.077 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:09:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:09:12.079 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:09:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:09:12.080 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:09:12 np0005592767 podman[183582]: 2026-01-22 22:09:12.20314874 +0000 UTC m=+0.122857581 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:09:14 np0005592767 python3.9[183736]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:09:15 np0005592767 python3.9[183889]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:15 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:09:15 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:09:16 np0005592767 python3.9[184042]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:17 np0005592767 python3.9[184194]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:09:18 np0005592767 podman[184273]: 2026-01-22 22:09:18.125970769 +0000 UTC m=+0.047882754 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 17:09:18 np0005592767 python3.9[184365]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 22 17:09:19 np0005592767 python3.9[184517]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:09:19 np0005592767 systemd[1]: Reloading.
Jan 22 17:09:19 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:09:19 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:09:20 np0005592767 python3.9[184704]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 22 17:09:21 np0005592767 python3.9[184857]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:22 np0005592767 python3.9[185007]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:09:23 np0005592767 python3.9[185161]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 22 17:09:24 np0005592767 python3.9[185313]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 22 17:09:25 np0005592767 python3.9[185466]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 22 17:09:27 np0005592767 python3.9[185624]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 22 17:09:29 np0005592767 python3.9[185784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:30 np0005592767 python3.9[185905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769119769.027828-521-246094057135848/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:30 np0005592767 python3.9[186055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:31 np0005592767 python3.9[186176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769119770.2734125-521-211422923678321/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:31 np0005592767 python3.9[186326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:32 np0005592767 python3.9[186447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769119771.4229836-521-87875766852086/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:33 np0005592767 python3.9[186597]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:09:34 np0005592767 python3.9[186749]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:09:34 np0005592767 python3.9[186901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:35 np0005592767 python3.9[187022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119774.4874494-698-43763041944316/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:36 np0005592767 python3.9[187172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:36 np0005592767 python3.9[187293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119775.6379776-698-75034762096166/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:37 np0005592767 python3.9[187443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:38 np0005592767 python3.9[187564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119777.0754433-785-155485177432379/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:39 np0005592767 python3.9[187714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:39 np0005592767 python3.9[187835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119778.7563024-832-108988717154321/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:40 np0005592767 python3.9[187985]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:41 np0005592767 python3.9[188106]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119780.3666642-878-53854977790704/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:42 np0005592767 python3.9[188256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:42 np0005592767 podman[188351]: 2026-01-22 22:09:42.682184057 +0000 UTC m=+0.096535819 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:09:42 np0005592767 python3.9[188388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119781.8067276-923-72607043167030/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:43 np0005592767 python3.9[188553]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:44 np0005592767 python3.9[188705]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:45 np0005592767 python3.9[188855]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:09:46 np0005592767 python3.9[189007]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:09:46 np0005592767 python3.9[189159]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:09:47 np0005592767 python3.9[189313]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:48 np0005592767 podman[189437]: 2026-01-22 22:09:48.350890616 +0000 UTC m=+0.060667733 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:09:48 np0005592767 python3.9[189483]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:09:48 np0005592767 systemd[1]: Reloading.
Jan 22 17:09:48 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:09:48 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:09:48 np0005592767 systemd[1]: Listening on Podman API Socket.
Jan 22 17:09:50 np0005592767 python3.9[189676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:50 np0005592767 python3.9[189799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119789.823519-1139-212768726042542/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:51 np0005592767 python3.9[189875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:51 np0005592767 python3.9[189998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119789.823519-1139-212768726042542/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:53 np0005592767 python3.9[190150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:53 np0005592767 nova_compute[182623]: 2026-01-22 22:09:53.642 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:53 np0005592767 nova_compute[182623]: 2026-01-22 22:09:53.671 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:54 np0005592767 python3.9[190302]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:09:54 np0005592767 python3.9[190454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:09:55 np0005592767 python3.9[190577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119794.370576-1283-128800106035676/.source.json _original_basename=.r_e7pngn follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:56 np0005592767 python3.9[190727]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:09:58 np0005592767 python3.9[191150]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.913 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.915 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.943 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.943 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.944 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:09:58 np0005592767 nova_compute[182623]: 2026-01-22 22:09:58.944 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.090 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.091 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6156MB free_disk=73.58277893066406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.091 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.092 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.161 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.162 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.182 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.200 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.202 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:09:59 np0005592767 nova_compute[182623]: 2026-01-22 22:09:59.202 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:09:59 np0005592767 python3.9[191302]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 17:10:00 np0005592767 python3[191454]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 17:10:01 np0005592767 podman[191491]: 2026-01-22 22:10:01.063288904 +0000 UTC m=+0.061341824 container create cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:10:01 np0005592767 podman[191491]: 2026-01-22 22:10:01.037370464 +0000 UTC m=+0.035423414 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 22 17:10:01 np0005592767 python3[191454]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=d88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 22 17:10:01 np0005592767 python3.9[191681]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:10:02 np0005592767 python3.9[191835]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:10:03 np0005592767 python3.9[191911]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 22 17:10:03 np0005592767 python3.9[192062]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769119803.3596358-1516-94864976074060/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:10:04 np0005592767 python3.9[192138]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 22 17:10:04 np0005592767 systemd[1]: Reloading.
Jan 22 17:10:04 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:10:04 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:10:05 np0005592767 python3.9[192249]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 22 17:10:05 np0005592767 systemd[1]: Reloading.
Jan 22 17:10:05 np0005592767 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 22 17:10:05 np0005592767 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 22 17:10:05 np0005592767 systemd[1]: Starting ceilometer_agent_compute container...
Jan 22 17:10:05 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:10:05 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85d4ed7290a4dcbef6afce0dbe0d1c22aedaf9301b9b3a2078c6d4fdef064e5f/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:05 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85d4ed7290a4dcbef6afce0dbe0d1c22aedaf9301b9b3a2078c6d4fdef064e5f/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:05 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85d4ed7290a4dcbef6afce0dbe0d1c22aedaf9301b9b3a2078c6d4fdef064e5f/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:05 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85d4ed7290a4dcbef6afce0dbe0d1c22aedaf9301b9b3a2078c6d4fdef064e5f/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 22 17:10:05 np0005592767 systemd[1]: Started /usr/bin/podman healthcheck run cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c.
Jan 22 17:10:05 np0005592767 podman[192288]: 2026-01-22 22:10:05.999014643 +0000 UTC m=+0.113275626 container init cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + sudo -E kolla_set_configs
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: sudo: unable to send audit message: Operation not permitted
Jan 22 17:10:06 np0005592767 podman[192288]: 2026-01-22 22:10:06.034024913 +0000 UTC m=+0.148285886 container start cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:10:06 np0005592767 podman[192288]: ceilometer_agent_compute
Jan 22 17:10:06 np0005592767 systemd[1]: Started ceilometer_agent_compute container.
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Validating config file
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Copying service configuration files
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: INFO:__main__:Writing out command to execute
Jan 22 17:10:06 np0005592767 podman[192311]: 2026-01-22 22:10:06.099021405 +0000 UTC m=+0.050734879 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: ++ cat /run_command
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + ARGS=
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + sudo kolla_copy_cacerts
Jan 22 17:10:06 np0005592767 systemd[1]: cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c-45dadc6dbad6d870.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 17:10:06 np0005592767 systemd[1]: cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c-45dadc6dbad6d870.service: Failed with result 'exit-code'.
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: sudo: unable to send audit message: Operation not permitted
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + [[ ! -n '' ]]
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + . kolla_extend_start
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + umask 0022
Jan 22 17:10:06 np0005592767 ceilometer_agent_compute[192304]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.069 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.069 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.069 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.069 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.070 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.071 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.072 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.073 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.074 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.075 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.076 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.077 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.078 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.079 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.080 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.081 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.082 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.083 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.084 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.085 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.085 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.085 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.085 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.102 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.103 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.103 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.200 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.280 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.280 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.280 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.280 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.281 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.282 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.283 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.284 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.285 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.286 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.287 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.288 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.289 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.290 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.291 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.292 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.293 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.294 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.295 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.296 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.297 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.298 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.299 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.302 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.309 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:10:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:10:09 np0005592767 python3.9[192493]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 22 17:10:10 np0005592767 python3.9[192645]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:10:10 np0005592767 python3.9[192770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769119809.8527339-1652-157508691549500/.source.yaml _original_basename=.b4g8d7vs follow=False checksum=58334609f985ec5c9646ffb07fd0ee42148b1595 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:10:11 np0005592767 python3.9[192922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:10:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:10:12.078 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:10:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:10:12.080 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:10:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:10:12.080 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:10:12 np0005592767 python3.9[193045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769119811.2252576-1697-171624109040253/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:10:13 np0005592767 podman[193116]: 2026-01-22 22:10:13.224048367 +0000 UTC m=+0.134195040 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 17:10:13 np0005592767 python3.9[193222]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:10:14 np0005592767 python3.9[193374]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 22 17:10:15 np0005592767 python3.9[193526]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 22 17:10:15 np0005592767 python3.9[193604]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.guqfr30v recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:10:16 np0005592767 python3.9[193754]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 22 17:10:18 np0005592767 python3.9[194177]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 22 17:10:19 np0005592767 podman[194226]: 2026-01-22 22:10:19.126060081 +0000 UTC m=+0.043469999 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:10:19 np0005592767 python3.9[194348]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 22 17:10:20 np0005592767 python3[194500]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 22 17:10:20 np0005592767 podman[194535]: 2026-01-22 22:10:20.670510271 +0000 UTC m=+0.019687993 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 22 17:10:20 np0005592767 podman[194535]: 2026-01-22 22:10:20.85117562 +0000 UTC m=+0.200353332 container create a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:10:20 np0005592767 python3[194500]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 
9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.453 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.472 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.482 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-935ae42e-b5f1-4d78-a90f-1ad8097b19ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.483 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.483 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.484 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.484 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.484 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.485 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.485 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.485 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.486 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.511 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.512 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.513 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.513 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.654 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.751 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:09 np0005592767 rsyslogd[1009]: imjournal: 2763 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.753 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.808 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.813 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/935ae42e-b5f1-4d78-a90f-1ad8097b19ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.868 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/935ae42e-b5f1-4d78-a90f-1ad8097b19ab/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.869 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/935ae42e-b5f1-4d78-a90f-1ad8097b19ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:09 np0005592767 nova_compute[182623]: 2026-01-22 22:16:09.930 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/935ae42e-b5f1-4d78-a90f-1ad8097b19ab/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.074 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.075 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5452MB free_disk=73.3401870727539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.076 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.076 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.171 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 935ae42e-b5f1-4d78-a90f-1ad8097b19ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.171 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.172 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.172 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:16:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:10Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:59:ab 10.1.0.187
Jan 22 17:16:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:10Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:59:ab 10.1.0.187
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.283 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.285 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.314 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.338 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.338 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.903 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.903 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.904 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.904 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.905 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.918 182627 INFO nova.compute.manager [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Terminating instance#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.933 182627 DEBUG nova.compute.manager [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:16:10 np0005592767 kernel: tapdcbcd25d-00 (unregistering): left promiscuous mode
Jan 22 17:16:10 np0005592767 NetworkManager[54973]: <info>  [1769120170.9512] device (tapdcbcd25d-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:16:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:10Z|00032|binding|INFO|Releasing lport dcbcd25d-0015-4b14-a104-ee8da7621961 from this chassis (sb_readonly=0)
Jan 22 17:16:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:10Z|00033|binding|INFO|Setting lport dcbcd25d-0015-4b14-a104-ee8da7621961 down in Southbound
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:10Z|00034|binding|INFO|Removing iface tapdcbcd25d-00 ovn-installed in OVS
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.960 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:10.965 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:59:ab 10.1.0.187 fdfe:381f:8400:2::2f5'], port_security=['fa:16:3e:8b:59:ab 10.1.0.187 fdfe:381f:8400:2::2f5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.187/26 fdfe:381f:8400:2::2f5/64', 'neutron:device_id': '27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5c10a85a248465c960e573d380cd07d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4e4ec01-67f2-4a9a-8ea0-8e8df5ac239e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44f330dd-5be1-46a4-b9d9-2996e37af063, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=dcbcd25d-0015-4b14-a104-ee8da7621961) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:16:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:10.968 104135 INFO neutron.agent.ovn.metadata.agent [-] Port dcbcd25d-0015-4b14-a104-ee8da7621961 in datapath 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 unbound from our chassis#033[00m
Jan 22 17:16:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:10.969 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:16:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:10.971 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6fda51ed-2e14-45b8-af2d-7a085fdd9dae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:10.972 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 namespace which is not needed anymore#033[00m
Jan 22 17:16:10 np0005592767 nova_compute[182623]: 2026-01-22 22:16:10.976 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 22 17:16:11 np0005592767 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 14.139s CPU time.
Jan 22 17:16:11 np0005592767 systemd-machined[153912]: Machine qemu-2-instance-00000004 terminated.
Jan 22 17:16:11 np0005592767 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211729]: [NOTICE]   (211733) : haproxy version is 2.8.14-c23fe91
Jan 22 17:16:11 np0005592767 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211729]: [NOTICE]   (211733) : path to executable is /usr/sbin/haproxy
Jan 22 17:16:11 np0005592767 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211729]: [WARNING]  (211733) : Exiting Master process...
Jan 22 17:16:11 np0005592767 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211729]: [WARNING]  (211733) : Exiting Master process...
Jan 22 17:16:11 np0005592767 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211729]: [ALERT]    (211733) : Current worker (211735) exited with code 143 (Terminated)
Jan 22 17:16:11 np0005592767 neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1[211729]: [WARNING]  (211733) : All workers exited. Exiting... (0)
Jan 22 17:16:11 np0005592767 systemd[1]: libpod-873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d.scope: Deactivated successfully.
Jan 22 17:16:11 np0005592767 podman[211828]: 2026-01-22 22:16:11.11806552 +0000 UTC m=+0.042260890 container died 873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:16:11 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:16:11 np0005592767 systemd[1]: var-lib-containers-storage-overlay-2268f99053a19625c0e39d332610113176567796e3920f5fbd1aa9ae2bc8e995-merged.mount: Deactivated successfully.
Jan 22 17:16:11 np0005592767 podman[211828]: 2026-01-22 22:16:11.161015338 +0000 UTC m=+0.085210698 container cleanup 873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:16:11 np0005592767 systemd[1]: libpod-conmon-873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d.scope: Deactivated successfully.
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.198 182627 INFO nova.virt.libvirt.driver [-] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Instance destroyed successfully.#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.198 182627 DEBUG nova.objects.instance [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lazy-loading 'resources' on Instance uuid 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.213 182627 DEBUG nova.virt.libvirt.vif [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-662460332-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-662460332-3',id=4,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2026-01-22T22:15:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5c10a85a248465c960e573d380cd07d',ramdisk_id='',reservation_id='r-0ajzb810',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-AutoAllocateNetworkTest-2038642621',owner_user_name='tempest-AutoAllocateNetworkTest-2038642621-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:15:57Z,user_data=None,user_id='d32f3e08e3df4d1ab5b54cafb9d93176',uuid=27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dcbcd25d-0015-4b14-a104-ee8da7621961", "address": "fa:16:3e:8b:59:ab", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::2f5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcbcd25d-00", "ovs_interfaceid": "dcbcd25d-0015-4b14-a104-ee8da7621961", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.214 182627 DEBUG nova.network.os_vif_util [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converting VIF {"id": "dcbcd25d-0015-4b14-a104-ee8da7621961", "address": "fa:16:3e:8b:59:ab", "network": {"id": "0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400:2::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:2::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:2::2f5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5c10a85a248465c960e573d380cd07d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcbcd25d-00", "ovs_interfaceid": "dcbcd25d-0015-4b14-a104-ee8da7621961", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.215 182627 DEBUG nova.network.os_vif_util [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:59:ab,bridge_name='br-int',has_traffic_filtering=True,id=dcbcd25d-0015-4b14-a104-ee8da7621961,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcbcd25d-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.215 182627 DEBUG os_vif [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:59:ab,bridge_name='br-int',has_traffic_filtering=True,id=dcbcd25d-0015-4b14-a104-ee8da7621961,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcbcd25d-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.219 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.219 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcbcd25d-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.221 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.222 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.226 182627 INFO os_vif [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:59:ab,bridge_name='br-int',has_traffic_filtering=True,id=dcbcd25d-0015-4b14-a104-ee8da7621961,network=Network(0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcbcd25d-00')#033[00m
Jan 22 17:16:11 np0005592767 podman[211867]: 2026-01-22 22:16:11.227149898 +0000 UTC m=+0.045002647 container remove 873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.227 182627 INFO nova.virt.libvirt.driver [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Deleting instance files /var/lib/nova/instances/27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a_del#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.228 182627 INFO nova.virt.libvirt.driver [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Deletion of /var/lib/nova/instances/27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a_del complete#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.235 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc1ad6e-f4e2-45a6-8a18-d67db41bced6]: (4, ('Thu Jan 22 10:16:11 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 (873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d)\n873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d\nThu Jan 22 10:16:11 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 (873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d)\n873184e54ac161bbac82aed42689268220fa581bf649523276486395b988658d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.237 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5059b9c5-8cc0-4d3e-812e-4415a33c458b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.238 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ae2c5c3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.239 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 kernel: tap0ae2c5c3-f0: left promiscuous mode
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.241 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.246 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b899d379-8e0c-463a-a1f3-b6c275d26313]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.252 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.261 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[84a1c94f-5f2c-4f28-bbfc-6bb7f921aa70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.263 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d0205e-22c9-4c97-bc9e-7fc104a9987f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.283 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b5445790-e32b-47fd-91d6-b64a7eb09c7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375819, 'reachable_time': 18634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211890, 'error': None, 'target': 'ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 systemd[1]: run-netns-ovnmeta\x2d0ae2c5c3\x2df5e5\x2d49e2\x2db4c8\x2db3ced0d580c1.mount: Deactivated successfully.
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.297 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0ae2c5c3-f5e5-49e2-b4c8-b3ced0d580c1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:16:11 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:11.298 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[96ee098e-11d6-4da5-8906-231f849c2df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.316 182627 DEBUG nova.virt.libvirt.host [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.316 182627 INFO nova.virt.libvirt.host [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] UEFI support detected#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.318 182627 INFO nova.compute.manager [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.319 182627 DEBUG oslo.service.loopingcall [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.320 182627 DEBUG nova.compute.manager [-] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:16:11 np0005592767 nova_compute[182623]: 2026-01-22 22:16:11.320 182627 DEBUG nova.network.neutron [-] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:16:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:12.087 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:12.088 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:12.088 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.177 182627 DEBUG nova.network.neutron [-] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.194 182627 INFO nova.compute.manager [-] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Took 0.87 seconds to deallocate network for instance.#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.350 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.351 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.474 182627 DEBUG nova.compute.provider_tree [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.488 182627 DEBUG nova.scheduler.client.report [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.513 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.537 182627 INFO nova.scheduler.client.report [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Deleted allocations for instance 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.632 182627 DEBUG oslo_concurrency.lockutils [None req-479627ab-a0f0-45d7-bd20-0b8b766c1f0c d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.725 182627 DEBUG nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Received event network-vif-unplugged-dcbcd25d-0015-4b14-a104-ee8da7621961 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.726 182627 DEBUG oslo_concurrency.lockutils [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.726 182627 DEBUG oslo_concurrency.lockutils [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.726 182627 DEBUG oslo_concurrency.lockutils [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.726 182627 DEBUG nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] No waiting events found dispatching network-vif-unplugged-dcbcd25d-0015-4b14-a104-ee8da7621961 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.727 182627 WARNING nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Received unexpected event network-vif-unplugged-dcbcd25d-0015-4b14-a104-ee8da7621961 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.727 182627 DEBUG nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Received event network-vif-plugged-dcbcd25d-0015-4b14-a104-ee8da7621961 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.727 182627 DEBUG oslo_concurrency.lockutils [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.727 182627 DEBUG oslo_concurrency.lockutils [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.727 182627 DEBUG oslo_concurrency.lockutils [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.728 182627 DEBUG nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] No waiting events found dispatching network-vif-plugged-dcbcd25d-0015-4b14-a104-ee8da7621961 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.728 182627 WARNING nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Received unexpected event network-vif-plugged-dcbcd25d-0015-4b14-a104-ee8da7621961 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:16:12 np0005592767 nova_compute[182623]: 2026-01-22 22:16:12.728 182627 DEBUG nova.compute.manager [req-8eb5c223-081b-4aa6-828b-550729dce9aa req-faa0862f-fe47-45af-8607-a8dc54d8fbf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Received event network-vif-deleted-dcbcd25d-0015-4b14-a104-ee8da7621961 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:16:13 np0005592767 nova_compute[182623]: 2026-01-22 22:16:13.597 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:13.598 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:16:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:13.603 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:16:14 np0005592767 nova_compute[182623]: 2026-01-22 22:16:14.474 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:16 np0005592767 nova_compute[182623]: 2026-01-22 22:16:16.221 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:17 np0005592767 nova_compute[182623]: 2026-01-22 22:16:17.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:17 np0005592767 podman[211892]: 2026-01-22 22:16:17.149934275 +0000 UTC m=+0.057514639 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.476 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.886 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "935ae42e-b5f1-4d78-a90f-1ad8097b19ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.886 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "935ae42e-b5f1-4d78-a90f-1ad8097b19ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.887 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "935ae42e-b5f1-4d78-a90f-1ad8097b19ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.887 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "935ae42e-b5f1-4d78-a90f-1ad8097b19ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.887 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "935ae42e-b5f1-4d78-a90f-1ad8097b19ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.899 182627 INFO nova.compute.manager [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Terminating instance#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.911 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "refresh_cache-935ae42e-b5f1-4d78-a90f-1ad8097b19ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.911 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquired lock "refresh_cache-935ae42e-b5f1-4d78-a90f-1ad8097b19ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:16:19 np0005592767 nova_compute[182623]: 2026-01-22 22:16:19.912 182627 DEBUG nova.network.neutron [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.265 182627 DEBUG nova.network.neutron [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.571 182627 DEBUG nova.network.neutron [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.591 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Releasing lock "refresh_cache-935ae42e-b5f1-4d78-a90f-1ad8097b19ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.592 182627 DEBUG nova.compute.manager [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:16:20 np0005592767 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 22 17:16:20 np0005592767 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.727s CPU time.
Jan 22 17:16:20 np0005592767 systemd-machined[153912]: Machine qemu-1-instance-00000001 terminated.
Jan 22 17:16:20 np0005592767 podman[211914]: 2026-01-22 22:16:20.752993825 +0000 UTC m=+0.077687826 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 22 17:16:20 np0005592767 podman[211913]: 2026-01-22 22:16:20.787460425 +0000 UTC m=+0.126453969 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.842 182627 INFO nova.virt.libvirt.driver [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Instance destroyed successfully.#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.843 182627 DEBUG nova.objects.instance [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lazy-loading 'resources' on Instance uuid 935ae42e-b5f1-4d78-a90f-1ad8097b19ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.857 182627 INFO nova.virt.libvirt.driver [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Deleting instance files /var/lib/nova/instances/935ae42e-b5f1-4d78-a90f-1ad8097b19ab_del#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.857 182627 INFO nova.virt.libvirt.driver [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Deletion of /var/lib/nova/instances/935ae42e-b5f1-4d78-a90f-1ad8097b19ab_del complete#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.917 182627 INFO nova.compute.manager [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.918 182627 DEBUG oslo.service.loopingcall [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.918 182627 DEBUG nova.compute.manager [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:16:20 np0005592767 nova_compute[182623]: 2026-01-22 22:16:20.918 182627 DEBUG nova.network.neutron [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.154 182627 DEBUG nova.network.neutron [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.166 182627 DEBUG nova.network.neutron [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.191 182627 INFO nova.compute.manager [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Took 0.27 seconds to deallocate network for instance.#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.223 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.255 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.256 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.326 182627 DEBUG nova.compute.provider_tree [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.348 182627 DEBUG nova.scheduler.client.report [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.376 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.410 182627 INFO nova.scheduler.client.report [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Deleted allocations for instance 935ae42e-b5f1-4d78-a90f-1ad8097b19ab#033[00m
Jan 22 17:16:21 np0005592767 nova_compute[182623]: 2026-01-22 22:16:21.499 182627 DEBUG oslo_concurrency.lockutils [None req-d018f673-c8ac-44a3-a7ee-3945d2409a33 d32f3e08e3df4d1ab5b54cafb9d93176 a5c10a85a248465c960e573d380cd07d - - default default] Lock "935ae42e-b5f1-4d78-a90f-1ad8097b19ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:21.606 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:24 np0005592767 nova_compute[182623]: 2026-01-22 22:16:24.479 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:26 np0005592767 podman[211970]: 2026-01-22 22:16:26.122302882 +0000 UTC m=+0.048339251 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 17:16:26 np0005592767 nova_compute[182623]: 2026-01-22 22:16:26.197 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120171.1952863, 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:16:26 np0005592767 nova_compute[182623]: 2026-01-22 22:16:26.197 182627 INFO nova.compute.manager [-] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:16:26 np0005592767 nova_compute[182623]: 2026-01-22 22:16:26.214 182627 DEBUG nova.compute.manager [None req-b453f392-1fd9-468a-b1ef-1feb85a2d109 - - - - - -] [instance: 27b74ba8-6c06-4dd0-92f3-9e0bb16f7e5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:26 np0005592767 nova_compute[182623]: 2026-01-22 22:16:26.225 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:29 np0005592767 nova_compute[182623]: 2026-01-22 22:16:29.480 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:29 np0005592767 podman[211989]: 2026-01-22 22:16:29.587082801 +0000 UTC m=+0.070064772 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:16:31 np0005592767 nova_compute[182623]: 2026-01-22 22:16:31.227 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:34 np0005592767 nova_compute[182623]: 2026-01-22 22:16:34.483 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:35 np0005592767 podman[212014]: 2026-01-22 22:16:35.20706964 +0000 UTC m=+0.108270867 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:16:35 np0005592767 nova_compute[182623]: 2026-01-22 22:16:35.841 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120180.8392127, 935ae42e-b5f1-4d78-a90f-1ad8097b19ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:16:35 np0005592767 nova_compute[182623]: 2026-01-22 22:16:35.841 182627 INFO nova.compute.manager [-] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] VM Stopped (Lifecycle Event)
Jan 22 17:16:35 np0005592767 nova_compute[182623]: 2026-01-22 22:16:35.863 182627 DEBUG nova.compute.manager [None req-0d7c7406-0ac9-44c6-9397-46525a40f5a7 - - - - - -] [instance: 935ae42e-b5f1-4d78-a90f-1ad8097b19ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:16:36 np0005592767 nova_compute[182623]: 2026-01-22 22:16:36.229 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:39 np0005592767 nova_compute[182623]: 2026-01-22 22:16:39.484 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:41 np0005592767 nova_compute[182623]: 2026-01-22 22:16:41.232 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.485 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.522 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating tmpfile /var/lib/nova/instances/tmp02xgsd7v to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.699 182627 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.727 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.727 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.733 182627 INFO nova.compute.rpcapi [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 22 17:16:44 np0005592767 nova_compute[182623]: 2026-01-22 22:16:44.734 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:46 np0005592767 nova_compute[182623]: 2026-01-22 22:16:46.234 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:46 np0005592767 nova_compute[182623]: 2026-01-22 22:16:46.713 182627 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 22 17:16:46 np0005592767 nova_compute[182623]: 2026-01-22 22:16:46.738 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:16:46 np0005592767 nova_compute[182623]: 2026-01-22 22:16:46.739 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:16:46 np0005592767 nova_compute[182623]: 2026-01-22 22:16:46.739 182627 DEBUG nova.network.neutron [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:16:48 np0005592767 podman[212037]: 2026-01-22 22:16:48.13680427 +0000 UTC m=+0.054871608 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.213 182627 DEBUG nova.network.neutron [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.230 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.240 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.241 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating instance directory: /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.241 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Creating disk.info with the contents: {'/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk': 'qcow2', '/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.242 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.242 182627 DEBUG nova.objects.instance [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.269 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.324 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.325 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.326 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.338 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.392 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.393 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.441 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.443 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.444 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.515 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.516 182627 DEBUG nova.virt.disk.api [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Checking if we can resize image /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.517 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.572 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.573 182627 DEBUG nova.virt.disk.api [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Cannot resize image /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.573 182627 DEBUG nova.objects.instance [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.585 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.746 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config 485376" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.747 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config to /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 17:16:48 np0005592767 nova_compute[182623]: 2026-01-22 22:16:48.747 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.252 182627 DEBUG oslo_concurrency.processutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk.config /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.253 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.254 182627 DEBUG nova.virt.libvirt.vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:16:39Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.255 182627 DEBUG nova.network.os_vif_util [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.255 182627 DEBUG nova.network.os_vif_util [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.256 182627 DEBUG os_vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.257 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.257 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.259 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.260 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cbb0272-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.260 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cbb0272-18, col_values=(('external_ids', {'iface-id': '3cbb0272-18e2-4845-aa69-d6a35ecb0d03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:6c:2e', 'vm-uuid': '1c2458ea-22d6-480f-ae75-5f050eb08b2b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:16:49 np0005592767 NetworkManager[54973]: <info>  [1769120209.2626] manager: (tap3cbb0272-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.265 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.267 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.268 182627 INFO os_vif [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18')#033[00m
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.269 182627 DEBUG nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.269 182627 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 22 17:16:49 np0005592767 nova_compute[182623]: 2026-01-22 22:16:49.487 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:50 np0005592767 nova_compute[182623]: 2026-01-22 22:16:50.712 182627 DEBUG nova.network.neutron [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 22 17:16:50 np0005592767 nova_compute[182623]: 2026-01-22 22:16:50.726 182627 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp02xgsd7v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 22 17:16:50 np0005592767 systemd[1]: Starting libvirt proxy daemon...
Jan 22 17:16:50 np0005592767 systemd[1]: Started libvirt proxy daemon.
Jan 22 17:16:50 np0005592767 podman[212080]: 2026-01-22 22:16:50.926878349 +0000 UTC m=+0.057028130 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64)
Jan 22 17:16:50 np0005592767 podman[212079]: 2026-01-22 22:16:50.973057435 +0000 UTC m=+0.097476851 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:16:51 np0005592767 kernel: tap3cbb0272-18: entered promiscuous mode
Jan 22 17:16:51 np0005592767 NetworkManager[54973]: <info>  [1769120211.0394] manager: (tap3cbb0272-18): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Jan 22 17:16:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:51Z|00035|binding|INFO|Claiming lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for this additional chassis.
Jan 22 17:16:51 np0005592767 nova_compute[182623]: 2026-01-22 22:16:51.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:51Z|00036|binding|INFO|3cbb0272-18e2-4845-aa69-d6a35ecb0d03: Claiming fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 17:16:51 np0005592767 nova_compute[182623]: 2026-01-22 22:16:51.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:51 np0005592767 nova_compute[182623]: 2026-01-22 22:16:51.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:51 np0005592767 systemd-udevd[212160]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:16:51 np0005592767 systemd-machined[153912]: New machine qemu-3-instance-00000007.
Jan 22 17:16:51 np0005592767 NetworkManager[54973]: <info>  [1769120211.0815] device (tap3cbb0272-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:16:51 np0005592767 NetworkManager[54973]: <info>  [1769120211.0821] device (tap3cbb0272-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:16:51 np0005592767 nova_compute[182623]: 2026-01-22 22:16:51.099 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:51Z|00037|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 ovn-installed in OVS
Jan 22 17:16:51 np0005592767 nova_compute[182623]: 2026-01-22 22:16:51.104 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:51 np0005592767 nova_compute[182623]: 2026-01-22 22:16:51.105 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:51 np0005592767 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Jan 22 17:16:52 np0005592767 nova_compute[182623]: 2026-01-22 22:16:52.225 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120212.2252731, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:16:52 np0005592767 nova_compute[182623]: 2026-01-22 22:16:52.227 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Started (Lifecycle Event)#033[00m
Jan 22 17:16:52 np0005592767 nova_compute[182623]: 2026-01-22 22:16:52.248 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.062 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120213.061613, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.063 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.081 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.085 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.103 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.135 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.136 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.155 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.245 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.246 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.252 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.252 182627 INFO nova.compute.claims [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.398 182627 DEBUG nova.compute.provider_tree [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.414 182627 DEBUG nova.scheduler.client.report [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.435 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.436 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.485 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.486 182627 DEBUG nova.network.neutron [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.504 182627 INFO nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.525 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.669 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.670 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.670 182627 INFO nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Creating image(s)#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.671 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "/var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.671 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "/var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.672 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "/var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.683 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.746 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.747 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.747 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.758 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.819 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.820 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.890 182627 DEBUG nova.policy [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a84b864eafb4f74a43b72cf303742cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce818300105f44b6abd8aa2b62699bda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.923 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk 1073741824" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.924 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.924 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.996 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.998 182627 DEBUG nova.virt.disk.api [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Checking if we can resize image /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:16:53 np0005592767 nova_compute[182623]: 2026-01-22 22:16:53.998 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.072 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.073 182627 DEBUG nova.virt.disk.api [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Cannot resize image /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.073 182627 DEBUG nova.objects.instance [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lazy-loading 'migration_context' on Instance uuid 2b3fa714-6109-48db-878b-f5e3d1420dba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.087 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.087 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Ensure instance console log exists: /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.088 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.088 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.088 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.266 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:54Z|00038|binding|INFO|Claiming lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for this chassis.
Jan 22 17:16:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:54Z|00039|binding|INFO|3cbb0272-18e2-4845-aa69-d6a35ecb0d03: Claiming fa:16:3e:8f:6c:2e 10.100.0.9
Jan 22 17:16:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:54Z|00040|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 up in Southbound
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.359 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:6c:2e 10.100.0.9'], port_security=['fa:16:3e:8f:6c:2e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3cbb0272-18e2-4845-aa69-d6a35ecb0d03) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.360 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b bound to our chassis#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.362 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0265f228-4e11-4f15-8d77-6acb409f3f7b#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.374 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d56856b8-0a5d-48ac-95dc-805840953c17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.375 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0265f228-41 in ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.377 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0265f228-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.377 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2d406e-65c6-4a25-8aac-34862c41cb48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.378 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf67f51-19e6-45d7-977c-963a30fa6338]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.388 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0417a7-2b24-4e14-b49e-72586789168a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.401 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8ad869-2f40-4e25-9028-f7e3f90e693f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.438 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf022aa-bcb0-49b1-9202-2b700c8b40bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 NetworkManager[54973]: <info>  [1769120214.4454] manager: (tap0265f228-40): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.444 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1cf0a3-08a0-45bf-aa64-aea998ec5de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.486 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3de1fe-9ab5-4a8a-8438-de2b8d4ce146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.491 182627 INFO nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Post operation of migration started#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.492 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc1aa5d-a7e7-4c57-b0ff-155366ee3d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.495 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:54 np0005592767 systemd-udevd[212204]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:16:54 np0005592767 NetworkManager[54973]: <info>  [1769120214.5263] device (tap0265f228-40): carrier: link connected
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.537 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4eddf5-fde7-4d82-9606-9d4617a53bb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.563 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[90c2ed72-06eb-45ed-b34c-b68713378830]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381110, 'reachable_time': 32828, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212223, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.589 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6433e2b7-7f9e-485d-9e0a-887e05555496]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:8003'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381110, 'tstamp': 381110}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212224, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.614 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a990c96d-aa9c-4528-9eee-6487ded13b37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381110, 'reachable_time': 32828, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212225, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.632 182627 DEBUG nova.network.neutron [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Successfully created port: 3c448812-44ed-4bd0-8b7d-3ad276e28d30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.653 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e5dcecb1-4c48-4dc3-b3aa-3d8ce3600457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.712 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a707f9-b8e1-4aa4-bc3e-8da67a380862]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.714 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.714 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.714 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0265f228-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:54 np0005592767 NetworkManager[54973]: <info>  [1769120214.7176] manager: (tap0265f228-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 22 17:16:54 np0005592767 kernel: tap0265f228-40: entered promiscuous mode
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.721 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0265f228-40, col_values=(('external_ids', {'iface-id': '7a6b2843-0304-440c-ac2a-e8d7f0e704c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:54Z|00041|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.722 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.725 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.726 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d12120-a589-4576-83a8-6ec279166420]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.727 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:16:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:54.727 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'env', 'PROCESS_TAG=haproxy-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0265f228-4e11-4f15-8d77-6acb409f3f7b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.734 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.990 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.990 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:16:54 np0005592767 nova_compute[182623]: 2026-01-22 22:16:54.990 182627 DEBUG nova.network.neutron [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:16:55 np0005592767 podman[212258]: 2026-01-22 22:16:55.139674466 +0000 UTC m=+0.057305219 container create 3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:16:55 np0005592767 systemd[1]: Started libpod-conmon-3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740.scope.
Jan 22 17:16:55 np0005592767 podman[212258]: 2026-01-22 22:16:55.108293238 +0000 UTC m=+0.025924011 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:16:55 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:16:55 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/153e53a0033a2f74397026861fe2415addc3c94bf42a8283ea1ba87e09a49bf1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:16:55 np0005592767 podman[212258]: 2026-01-22 22:16:55.240954676 +0000 UTC m=+0.158585479 container init 3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:16:55 np0005592767 podman[212258]: 2026-01-22 22:16:55.252374407 +0000 UTC m=+0.170005180 container start 3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:16:55 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [NOTICE]   (212277) : New worker (212279) forked
Jan 22 17:16:55 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [NOTICE]   (212277) : Loading success.
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.678 182627 DEBUG nova.network.neutron [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Successfully updated port: 3c448812-44ed-4bd0-8b7d-3ad276e28d30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.693 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.693 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquired lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.693 182627 DEBUG nova.network.neutron [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.773 182627 DEBUG nova.compute.manager [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-changed-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.773 182627 DEBUG nova.compute.manager [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Refreshing instance network info cache due to event network-changed-3c448812-44ed-4bd0-8b7d-3ad276e28d30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.773 182627 DEBUG oslo_concurrency.lockutils [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:16:55 np0005592767 nova_compute[182623]: 2026-01-22 22:16:55.917 182627 DEBUG nova.network.neutron [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.429 182627 DEBUG nova.network.neutron [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.446 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.467 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.468 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.468 182627 DEBUG oslo_concurrency.lockutils [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.474 182627 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 22 17:16:56 np0005592767 virtqemud[182095]: Domain id=3 name='instance-00000007' uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b is tainted: custom-monitor
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.679 182627 DEBUG nova.network.neutron [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Updating instance_info_cache with network_info: [{"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.700 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Releasing lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.700 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Instance network_info: |[{"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.701 182627 DEBUG oslo_concurrency.lockutils [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.701 182627 DEBUG nova.network.neutron [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Refreshing network info cache for port 3c448812-44ed-4bd0-8b7d-3ad276e28d30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.705 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Start _get_guest_xml network_info=[{"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.709 182627 WARNING nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.713 182627 DEBUG nova.virt.libvirt.host [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.714 182627 DEBUG nova.virt.libvirt.host [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.721 182627 DEBUG nova.virt.libvirt.host [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.722 182627 DEBUG nova.virt.libvirt.host [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.723 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.724 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:16:47Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='963106166',id=27,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-2125659226',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.724 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.725 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.725 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.725 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.725 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.726 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.726 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.726 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.727 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.727 182627 DEBUG nova.virt.hardware [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.732 182627 DEBUG nova.virt.libvirt.vif [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-373463314',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-373463314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(27),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-373463314',id=9,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=27,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELnKbntEh9G5MZ12eWzXWY70wmzm3VzTvPu77lxVBnBlu+iJcGSuBYidHHZBA8p9HhdKhj62AfoRNg23cQd+DsuFFmxgPxFDa0kr+Edd2cHMgT/i1pJKJXbktLYbsAIGQ==',key_name='tempest-keypair-609803177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce818300105f44b6abd8aa2b62699bda',ramdisk_id='',reservation_id='r-on1n29m8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-469507991',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:16:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a84b864eafb4f74a43b72cf303742cc',uuid=2b3fa714-6109-48db-878b-f5e3d1420dba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.733 182627 DEBUG nova.network.os_vif_util [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converting VIF {"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.733 182627 DEBUG nova.network.os_vif_util [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.735 182627 DEBUG nova.objects.instance [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b3fa714-6109-48db-878b-f5e3d1420dba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.752 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <uuid>2b3fa714-6109-48db-878b-f5e3d1420dba</uuid>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <name>instance-00000009</name>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-373463314</nova:name>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:16:56</nova:creationTime>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:flavor name="tempest-flavor_with_ephemeral_0-2125659226">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:user uuid="5a84b864eafb4f74a43b72cf303742cc">tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member</nova:user>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:project uuid="ce818300105f44b6abd8aa2b62699bda">tempest-ServersWithSpecificFlavorTestJSON-469507991</nova:project>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        <nova:port uuid="3c448812-44ed-4bd0-8b7d-3ad276e28d30">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <entry name="serial">2b3fa714-6109-48db-878b-f5e3d1420dba</entry>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <entry name="uuid">2b3fa714-6109-48db-878b-f5e3d1420dba</entry>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.config"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:0c:f5:6e"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <target dev="tap3c448812-44"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/console.log" append="off"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:16:56 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:16:56 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:16:56 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:16:56 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.753 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Preparing to wait for external event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.753 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.754 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.754 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.754 182627 DEBUG nova.virt.libvirt.vif [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-373463314',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-373463314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(27),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-373463314',id=9,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=27,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELnKbntEh9G5MZ12eWzXWY70wmzm3VzTvPu77lxVBnBlu+iJcGSuBYidHHZBA8p9HhdKhj62AfoRNg23cQd+DsuFFmxgPxFDa0kr+Edd2cHMgT/i1pJKJXbktLYbsAIGQ==',key_name='tempest-keypair-609803177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce818300105f44b6abd8aa2b62699bda',ramdisk_id='',reservation_id='r-on1n29m8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-469507991',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:16:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a84b864eafb4f74a43b72cf303742cc',uuid=2b3fa714-6109-48db-878b-f5e3d1420dba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.755 182627 DEBUG nova.network.os_vif_util [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converting VIF {"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.755 182627 DEBUG nova.network.os_vif_util [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.756 182627 DEBUG os_vif [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.756 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.756 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.757 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.759 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.760 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c448812-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.760 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c448812-44, col_values=(('external_ids', {'iface-id': '3c448812-44ed-4bd0-8b7d-3ad276e28d30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:f5:6e', 'vm-uuid': '2b3fa714-6109-48db-878b-f5e3d1420dba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:56 np0005592767 NetworkManager[54973]: <info>  [1769120216.7630] manager: (tap3c448812-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.764 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.769 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.771 182627 INFO os_vif [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44')#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.830 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.830 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.830 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No VIF found with MAC fa:16:3e:0c:f5:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:16:56 np0005592767 nova_compute[182623]: 2026-01-22 22:16:56.831 182627 INFO nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Using config drive#033[00m
Jan 22 17:16:56 np0005592767 podman[212291]: 2026-01-22 22:16:56.861536491 +0000 UTC m=+0.055837046 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.233 182627 INFO nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Creating config drive at /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.config#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.238 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphvhkwj01 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.367 182627 DEBUG oslo_concurrency.processutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphvhkwj01" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:16:57 np0005592767 kernel: tap3c448812-44: entered promiscuous mode
Jan 22 17:16:57 np0005592767 NetworkManager[54973]: <info>  [1769120217.4135] manager: (tap3c448812-44): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Jan 22 17:16:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:57Z|00042|binding|INFO|Claiming lport 3c448812-44ed-4bd0-8b7d-3ad276e28d30 for this chassis.
Jan 22 17:16:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:57Z|00043|binding|INFO|3c448812-44ed-4bd0-8b7d-3ad276e28d30: Claiming fa:16:3e:0c:f5:6e 10.100.0.10
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.414 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 systemd-udevd[212216]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.416 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.421 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 NetworkManager[54973]: <info>  [1769120217.4295] device (tap3c448812-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:16:57 np0005592767 NetworkManager[54973]: <info>  [1769120217.4314] device (tap3c448812-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.433 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:f5:6e 10.100.0.10'], port_security=['fa:16:3e:0c:f5:6e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2b3fa714-6109-48db-878b-f5e3d1420dba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31d2ab9-cf9f-454b-9df5-065776293667', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce818300105f44b6abd8aa2b62699bda', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3d5f0cd7-7223-4821-8ca2-c6a12fc0cebe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be6ad440-24bb-413c-a01d-43bdddaaa797, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3c448812-44ed-4bd0-8b7d-3ad276e28d30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.434 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3c448812-44ed-4bd0-8b7d-3ad276e28d30 in datapath a31d2ab9-cf9f-454b-9df5-065776293667 bound to our chassis#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.436 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a31d2ab9-cf9f-454b-9df5-065776293667#033[00m
Jan 22 17:16:57 np0005592767 systemd-machined[153912]: New machine qemu-4-instance-00000009.
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.453 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9c61b54f-14ec-4034-8671-1a87cc903b86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.455 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa31d2ab9-c1 in ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.456 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa31d2ab9-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.456 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[10575cdb-aa12-4a48-828b-84d062936bda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.457 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c5af7cb-99c1-43f5-8b55-101be40340ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.467 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[e8097e9e-37dd-411c-a529-b14aac8be0f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:57Z|00044|binding|INFO|Setting lport 3c448812-44ed-4bd0-8b7d-3ad276e28d30 ovn-installed in OVS
Jan 22 17:16:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:57Z|00045|binding|INFO|Setting lport 3c448812-44ed-4bd0-8b7d-3ad276e28d30 up in Southbound
Jan 22 17:16:57 np0005592767 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.474 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.481 182627 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.480 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[877ab46e-e048-46e3-9311-b72dd0e75b6f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.509 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba440e4-a8ef-4ef9-b4a9-cf67088742f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 NetworkManager[54973]: <info>  [1769120217.5187] manager: (tapa31d2ab9-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.518 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[22b426f8-8fc8-4b0d-ac01-307d2caaa6c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.544 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3c6261-c327-451e-b23e-46aea69a0cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.548 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c218ce19-2e93-4dd8-97c6-76058bffb8c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 systemd-udevd[212340]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:16:57 np0005592767 NetworkManager[54973]: <info>  [1769120217.5725] device (tapa31d2ab9-c0): carrier: link connected
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.580 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8aed89-995d-438e-b61b-8e44ff5a7863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.600 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d63a61e-7859-412c-a458-ed3cec0bf041]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa31d2ab9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:73:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381415, 'reachable_time': 42761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212360, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.616 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a114e7f-679d-42b1-8cf0-28c001bce74c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:7381'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381415, 'tstamp': 381415}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212361, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.641 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1335e23e-c3bf-46a6-bdaa-ab58b6138f35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa31d2ab9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:73:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381415, 'reachable_time': 42761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212363, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.678 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[69f99575-b7b8-4c32-8bb5-87ae5b1b9caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.735 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61772d6a-975c-4894-84ed-ba55e6b430c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.738 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa31d2ab9-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.739 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.740 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa31d2ab9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:57 np0005592767 NetworkManager[54973]: <info>  [1769120217.7431] manager: (tapa31d2ab9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.743 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 kernel: tapa31d2ab9-c0: entered promiscuous mode
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.745 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.746 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa31d2ab9-c0, col_values=(('external_ids', {'iface-id': '06d3bb0f-011c-43a0-a675-327a84bfa758'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.746 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:16:57Z|00046|binding|INFO|Releasing lport 06d3bb0f-011c-43a0-a675-327a84bfa758 from this chassis (sb_readonly=0)
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.761 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a31d2ab9-cf9f-454b-9df5-065776293667.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a31d2ab9-cf9f-454b-9df5-065776293667.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.762 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cf292a0b-0e9a-4da1-9e4c-aae14a37b977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.763 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-a31d2ab9-cf9f-454b-9df5-065776293667
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/a31d2ab9-cf9f-454b-9df5-065776293667.pid.haproxy
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID a31d2ab9-cf9f-454b-9df5-065776293667
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:16:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:16:57.763 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'env', 'PROCESS_TAG=haproxy-a31d2ab9-cf9f-454b-9df5-065776293667', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a31d2ab9-cf9f-454b-9df5-065776293667.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.837 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120217.8358278, 2b3fa714-6109-48db-878b-f5e3d1420dba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.837 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] VM Started (Lifecycle Event)#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.875 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.880 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120217.8362706, 2b3fa714-6109-48db-878b-f5e3d1420dba => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.880 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.900 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.904 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.914 182627 DEBUG nova.compute.manager [req-18ff5bd0-6a6a-4fc6-b8d6-7c3a38ab1b49 req-fc9c2561-2ee1-4019-9b55-72977602567f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.914 182627 DEBUG oslo_concurrency.lockutils [req-18ff5bd0-6a6a-4fc6-b8d6-7c3a38ab1b49 req-fc9c2561-2ee1-4019-9b55-72977602567f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.915 182627 DEBUG oslo_concurrency.lockutils [req-18ff5bd0-6a6a-4fc6-b8d6-7c3a38ab1b49 req-fc9c2561-2ee1-4019-9b55-72977602567f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.915 182627 DEBUG oslo_concurrency.lockutils [req-18ff5bd0-6a6a-4fc6-b8d6-7c3a38ab1b49 req-fc9c2561-2ee1-4019-9b55-72977602567f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.915 182627 DEBUG nova.compute.manager [req-18ff5bd0-6a6a-4fc6-b8d6-7c3a38ab1b49 req-fc9c2561-2ee1-4019-9b55-72977602567f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Processing event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.916 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.920 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.923 182627 INFO nova.virt.libvirt.driver [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Instance spawned successfully.#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.924 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.937 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.938 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120217.9203475, 2b3fa714-6109-48db-878b-f5e3d1420dba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.938 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.963 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.964 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.964 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.964 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.965 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.965 182627 DEBUG nova.virt.libvirt.driver [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.975 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:57 np0005592767 nova_compute[182623]: 2026-01-22 22:16:57.977 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.009 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.077 182627 INFO nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Took 4.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.078 182627 DEBUG nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.164 182627 INFO nova.compute.manager [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Took 4.96 seconds to build instance.#033[00m
Jan 22 17:16:58 np0005592767 podman[212401]: 2026-01-22 22:16:58.173983151 +0000 UTC m=+0.052661925 container create b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.176 182627 DEBUG nova.network.neutron [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Updated VIF entry in instance network info cache for port 3c448812-44ed-4bd0-8b7d-3ad276e28d30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.177 182627 DEBUG nova.network.neutron [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Updating instance_info_cache with network_info: [{"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.181 182627 DEBUG oslo_concurrency.lockutils [None req-8f4f49d4-6e81-41ee-b115-d5718b949120 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.191 182627 DEBUG oslo_concurrency.lockutils [req-0a21419e-93e1-463a-af58-553643a1ac7d req-410ca3e9-3579-4452-a323-2c026a7abf9a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:16:58 np0005592767 systemd[1]: Started libpod-conmon-b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde.scope.
Jan 22 17:16:58 np0005592767 podman[212401]: 2026-01-22 22:16:58.147460733 +0000 UTC m=+0.026139537 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:16:58 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:16:58 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4084653e40aca75be4161705e52841198e423aa95597859e8026a916104eee0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:16:58 np0005592767 podman[212401]: 2026-01-22 22:16:58.265516849 +0000 UTC m=+0.144195633 container init b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:16:58 np0005592767 podman[212401]: 2026-01-22 22:16:58.275958251 +0000 UTC m=+0.154637045 container start b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:16:58 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [NOTICE]   (212420) : New worker (212422) forked
Jan 22 17:16:58 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [NOTICE]   (212420) : Loading success.
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.491 182627 INFO nova.virt.libvirt.driver [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.498 182627 DEBUG nova.compute.manager [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:16:58 np0005592767 nova_compute[182623]: 2026-01-22 22:16:58.516 182627 DEBUG nova.objects.instance [None req-17d54b8f-4444-47cc-bed7-e20e5525f9f8 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:16:59 np0005592767 nova_compute[182623]: 2026-01-22 22:16:59.493 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:00 np0005592767 nova_compute[182623]: 2026-01-22 22:17:00.043 182627 DEBUG nova.compute.manager [req-c79683e7-89ab-4283-bf23-32370c866274 req-8cb9a3fb-3813-4ecd-8263-ea422b913b23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:00 np0005592767 nova_compute[182623]: 2026-01-22 22:17:00.043 182627 DEBUG oslo_concurrency.lockutils [req-c79683e7-89ab-4283-bf23-32370c866274 req-8cb9a3fb-3813-4ecd-8263-ea422b913b23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:00 np0005592767 nova_compute[182623]: 2026-01-22 22:17:00.044 182627 DEBUG oslo_concurrency.lockutils [req-c79683e7-89ab-4283-bf23-32370c866274 req-8cb9a3fb-3813-4ecd-8263-ea422b913b23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:00 np0005592767 nova_compute[182623]: 2026-01-22 22:17:00.044 182627 DEBUG oslo_concurrency.lockutils [req-c79683e7-89ab-4283-bf23-32370c866274 req-8cb9a3fb-3813-4ecd-8263-ea422b913b23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:00 np0005592767 nova_compute[182623]: 2026-01-22 22:17:00.044 182627 DEBUG nova.compute.manager [req-c79683e7-89ab-4283-bf23-32370c866274 req-8cb9a3fb-3813-4ecd-8263-ea422b913b23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] No waiting events found dispatching network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:00 np0005592767 nova_compute[182623]: 2026-01-22 22:17:00.044 182627 WARNING nova.compute.manager [req-c79683e7-89ab-4283-bf23-32370c866274 req-8cb9a3fb-3813-4ecd-8263-ea422b913b23 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received unexpected event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:17:00 np0005592767 podman[212431]: 2026-01-22 22:17:00.182176898 +0000 UTC m=+0.088738618 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:17:01 np0005592767 nova_compute[182623]: 2026-01-22 22:17:01.763 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8416] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/33)
Jan 22 17:17:01 np0005592767 nova_compute[182623]: 2026-01-22 22:17:01.841 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8424] device (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <warn>  [1769120221.8425] device (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8434] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/34)
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8438] device (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <warn>  [1769120221.8439] device (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8447] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8457] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8461] device (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 17:17:01 np0005592767 NetworkManager[54973]: <info>  [1769120221.8465] device (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 22 17:17:01 np0005592767 nova_compute[182623]: 2026-01-22 22:17:01.941 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:01Z|00047|binding|INFO|Releasing lport 06d3bb0f-011c-43a0-a675-327a84bfa758 from this chassis (sb_readonly=0)
Jan 22 17:17:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:01Z|00048|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 17:17:01 np0005592767 nova_compute[182623]: 2026-01-22 22:17:01.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:03 np0005592767 nova_compute[182623]: 2026-01-22 22:17:03.261 182627 DEBUG nova.compute.manager [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-changed-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:03 np0005592767 nova_compute[182623]: 2026-01-22 22:17:03.262 182627 DEBUG nova.compute.manager [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Refreshing instance network info cache due to event network-changed-3c448812-44ed-4bd0-8b7d-3ad276e28d30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:17:03 np0005592767 nova_compute[182623]: 2026-01-22 22:17:03.262 182627 DEBUG oslo_concurrency.lockutils [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:03 np0005592767 nova_compute[182623]: 2026-01-22 22:17:03.263 182627 DEBUG oslo_concurrency.lockutils [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:03 np0005592767 nova_compute[182623]: 2026-01-22 22:17:03.263 182627 DEBUG nova.network.neutron [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Refreshing network info cache for port 3c448812-44ed-4bd0-8b7d-3ad276e28d30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:17:04 np0005592767 nova_compute[182623]: 2026-01-22 22:17:04.498 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:04 np0005592767 nova_compute[182623]: 2026-01-22 22:17:04.526 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Check if temp file /var/lib/nova/instances/tmpjb5ww5kl exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 22 17:17:04 np0005592767 nova_compute[182623]: 2026-01-22 22:17:04.527 182627 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 22 17:17:04 np0005592767 nova_compute[182623]: 2026-01-22 22:17:04.747 182627 DEBUG nova.network.neutron [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Updated VIF entry in instance network info cache for port 3c448812-44ed-4bd0-8b7d-3ad276e28d30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:17:04 np0005592767 nova_compute[182623]: 2026-01-22 22:17:04.748 182627 DEBUG nova.network.neutron [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Updating instance_info_cache with network_info: [{"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:04 np0005592767 nova_compute[182623]: 2026-01-22 22:17:04.814 182627 DEBUG oslo_concurrency.lockutils [req-dbf19d09-457b-4c15-aa46-76396cd60f7c req-c1cc8501-b306-4ff4-8234-7e24de7779c6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2b3fa714-6109-48db-878b-f5e3d1420dba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:17:05 np0005592767 nova_compute[182623]: 2026-01-22 22:17:05.585 182627 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:05 np0005592767 nova_compute[182623]: 2026-01-22 22:17:05.648 182627 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:05 np0005592767 nova_compute[182623]: 2026-01-22 22:17:05.650 182627 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:05 np0005592767 nova_compute[182623]: 2026-01-22 22:17:05.710 182627 DEBUG oslo_concurrency.processutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:06 np0005592767 podman[212462]: 2026-01-22 22:17:06.173050366 +0000 UTC m=+0.086712870 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:17:06 np0005592767 nova_compute[182623]: 2026-01-22 22:17:06.765 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:06 np0005592767 nova_compute[182623]: 2026-01-22 22:17:06.962 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:06 np0005592767 nova_compute[182623]: 2026-01-22 22:17:06.963 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:06 np0005592767 nova_compute[182623]: 2026-01-22 22:17:06.985 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:06 np0005592767 nova_compute[182623]: 2026-01-22 22:17:06.986 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.026 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.028 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.029 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.029 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.030 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.031 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.032 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:07 np0005592767 nova_compute[182623]: 2026-01-22 22:17:07.033 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:17:08 np0005592767 systemd-logind[802]: New session 27 of user nova.
Jan 22 17:17:08 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:17:08 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:17:08 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:17:08 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:17:08 np0005592767 systemd[212495]: Queued start job for default target Main User Target.
Jan 22 17:17:08 np0005592767 systemd[212495]: Created slice User Application Slice.
Jan 22 17:17:08 np0005592767 systemd[212495]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:17:08 np0005592767 systemd[212495]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:17:08 np0005592767 systemd[212495]: Reached target Paths.
Jan 22 17:17:08 np0005592767 systemd[212495]: Reached target Timers.
Jan 22 17:17:08 np0005592767 systemd[212495]: Starting D-Bus User Message Bus Socket...
Jan 22 17:17:08 np0005592767 systemd[212495]: Starting Create User's Volatile Files and Directories...
Jan 22 17:17:08 np0005592767 systemd[212495]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:17:08 np0005592767 systemd[212495]: Reached target Sockets.
Jan 22 17:17:08 np0005592767 nova_compute[182623]: 2026-01-22 22:17:08.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:17:08 np0005592767 systemd[212495]: Finished Create User's Volatile Files and Directories.
Jan 22 17:17:08 np0005592767 systemd[212495]: Reached target Basic System.
Jan 22 17:17:08 np0005592767 systemd[212495]: Reached target Main User Target.
Jan 22 17:17:08 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:17:08 np0005592767 systemd[212495]: Startup finished in 174ms.
Jan 22 17:17:08 np0005592767 systemd[1]: Started Session 27 of User nova.
Jan 22 17:17:08 np0005592767 nova_compute[182623]: 2026-01-22 22:17:08.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:08 np0005592767 nova_compute[182623]: 2026-01-22 22:17:08.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:08 np0005592767 nova_compute[182623]: 2026-01-22 22:17:08.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:08 np0005592767 nova_compute[182623]: 2026-01-22 22:17:08.927 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:17:08 np0005592767 systemd[1]: session-27.scope: Deactivated successfully.
Jan 22 17:17:08 np0005592767 systemd-logind[802]: Session 27 logged out. Waiting for processes to exit.
Jan 22 17:17:08 np0005592767 systemd-logind[802]: Removed session 27.
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.014 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.089 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.091 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.148 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.152 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.207 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.208 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.265 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.416 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.417 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5424MB free_disk=73.35232162475586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.418 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.418 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.470 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating resource usage from migration 212b821b-cc9d-4094-bb7b-b23ad6071dc7#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.497 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.501 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 2b3fa714-6109-48db-878b-f5e3d1420dba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.501 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration 212b821b-cc9d-4094-bb7b-b23ad6071dc7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.501 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.501 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.572 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.584 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.600 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:17:09 np0005592767 nova_compute[182623]: 2026-01-22 22:17:09.600 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:10 np0005592767 nova_compute[182623]: 2026-01-22 22:17:10.417 182627 DEBUG nova.compute.manager [req-2adaf39b-1893-4af5-8bbd-3fbaee32b745 req-3ade1a0c-51a3-4de1-b283-921b227ecee3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:10 np0005592767 nova_compute[182623]: 2026-01-22 22:17:10.417 182627 DEBUG oslo_concurrency.lockutils [req-2adaf39b-1893-4af5-8bbd-3fbaee32b745 req-3ade1a0c-51a3-4de1-b283-921b227ecee3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:10 np0005592767 nova_compute[182623]: 2026-01-22 22:17:10.418 182627 DEBUG oslo_concurrency.lockutils [req-2adaf39b-1893-4af5-8bbd-3fbaee32b745 req-3ade1a0c-51a3-4de1-b283-921b227ecee3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:10 np0005592767 nova_compute[182623]: 2026-01-22 22:17:10.418 182627 DEBUG oslo_concurrency.lockutils [req-2adaf39b-1893-4af5-8bbd-3fbaee32b745 req-3ade1a0c-51a3-4de1-b283-921b227ecee3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:10 np0005592767 nova_compute[182623]: 2026-01-22 22:17:10.419 182627 DEBUG nova.compute.manager [req-2adaf39b-1893-4af5-8bbd-3fbaee32b745 req-3ade1a0c-51a3-4de1-b283-921b227ecee3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:10 np0005592767 nova_compute[182623]: 2026-01-22 22:17:10.419 182627 DEBUG nova.compute.manager [req-2adaf39b-1893-4af5-8bbd-3fbaee32b745 req-3ade1a0c-51a3-4de1-b283-921b227ecee3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:17:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:11Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0c:f5:6e 10.100.0.10
Jan 22 17:17:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:11Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0c:f5:6e 10.100.0.10
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.769 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.864 182627 INFO nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Took 6.15 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.864 182627 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.886 182627 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjb5ww5kl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c2458ea-22d6-480f-ae75-5f050eb08b2b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(212b821b-cc9d-4094-bb7b-b23ad6071dc7),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.907 182627 DEBUG nova.objects.instance [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c2458ea-22d6-480f-ae75-5f050eb08b2b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.908 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.910 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.910 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.923 182627 DEBUG nova.virt.libvirt.vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:16:58Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.923 182627 DEBUG nova.network.os_vif_util [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.924 182627 DEBUG nova.network.os_vif_util [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.925 182627 DEBUG nova.virt.libvirt.migration [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 17:17:11 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:8f:6c:2e"/>
Jan 22 17:17:11 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:17:11 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:11 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:17:11 np0005592767 nova_compute[182623]:  <target dev="tap3cbb0272-18"/>
Jan 22 17:17:11 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:17:11 np0005592767 nova_compute[182623]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 22 17:17:11 np0005592767 nova_compute[182623]: 2026-01-22 22:17:11.926 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 22 17:17:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:12.089 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:12.090 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:12.091 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.414 182627 DEBUG nova.virt.libvirt.migration [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.415 182627 INFO nova.virt.libvirt.migration [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.910 182627 DEBUG nova.compute.manager [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.910 182627 DEBUG oslo_concurrency.lockutils [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.911 182627 DEBUG oslo_concurrency.lockutils [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.911 182627 DEBUG oslo_concurrency.lockutils [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.912 182627 DEBUG nova.compute.manager [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.912 182627 WARNING nova.compute.manager [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.913 182627 DEBUG nova.compute.manager [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-changed-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.913 182627 DEBUG nova.compute.manager [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Refreshing instance network info cache due to event network-changed-3cbb0272-18e2-4845-aa69-d6a35ecb0d03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.913 182627 DEBUG oslo_concurrency.lockutils [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.914 182627 DEBUG oslo_concurrency.lockutils [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:12 np0005592767 nova_compute[182623]: 2026-01-22 22:17:12.914 182627 DEBUG nova.network.neutron [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Refreshing network info cache for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.099 182627 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.603 182627 DEBUG nova.virt.libvirt.migration [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.604 182627 DEBUG nova.virt.libvirt.migration [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.685 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120233.68465, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.685 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.701 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.705 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.727 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 22 17:17:13 np0005592767 kernel: tap3cbb0272-18 (unregistering): left promiscuous mode
Jan 22 17:17:13 np0005592767 NetworkManager[54973]: <info>  [1769120233.8427] device (tap3cbb0272-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:17:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:13Z|00049|binding|INFO|Releasing lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 from this chassis (sb_readonly=0)
Jan 22 17:17:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:13Z|00050|binding|INFO|Setting lport 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 down in Southbound
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.847 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:13Z|00051|binding|INFO|Removing iface tap3cbb0272-18 ovn-installed in OVS
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.849 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:13.855 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:6c:2e 10.100.0.9'], port_security=['fa:16:3e:8f:6c:2e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1c2458ea-22d6-480f-ae75-5f050eb08b2b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '18', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3cbb0272-18e2-4845-aa69-d6a35ecb0d03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:13.856 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis#033[00m
Jan 22 17:17:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:13.857 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0265f228-4e11-4f15-8d77-6acb409f3f7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:17:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:13.859 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0444282d-7bc1-4a67-a51d-58d365105143]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:13.860 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace which is not needed anymore#033[00m
Jan 22 17:17:13 np0005592767 nova_compute[182623]: 2026-01-22 22:17:13.865 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:13 np0005592767 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 22 17:17:13 np0005592767 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 3.112s CPU time.
Jan 22 17:17:13 np0005592767 systemd-machined[153912]: Machine qemu-3-instance-00000007 terminated.
Jan 22 17:17:14 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [NOTICE]   (212277) : haproxy version is 2.8.14-c23fe91
Jan 22 17:17:14 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [NOTICE]   (212277) : path to executable is /usr/sbin/haproxy
Jan 22 17:17:14 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [WARNING]  (212277) : Exiting Master process...
Jan 22 17:17:14 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [ALERT]    (212277) : Current worker (212279) exited with code 143 (Terminated)
Jan 22 17:17:14 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[212273]: [WARNING]  (212277) : All workers exited. Exiting... (0)
Jan 22 17:17:14 np0005592767 systemd[1]: libpod-3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740.scope: Deactivated successfully.
Jan 22 17:17:14 np0005592767 podman[212566]: 2026-01-22 22:17:14.013397569 +0000 UTC m=+0.046792525 container died 3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:17:14 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740-userdata-shm.mount: Deactivated successfully.
Jan 22 17:17:14 np0005592767 systemd[1]: var-lib-containers-storage-overlay-153e53a0033a2f74397026861fe2415addc3c94bf42a8283ea1ba87e09a49bf1-merged.mount: Deactivated successfully.
Jan 22 17:17:14 np0005592767 podman[212566]: 2026-01-22 22:17:14.057264078 +0000 UTC m=+0.090659034 container cleanup 3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:17:14 np0005592767 systemd[1]: libpod-conmon-3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740.scope: Deactivated successfully.
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.082 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.082 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.083 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.106 182627 DEBUG nova.virt.libvirt.guest [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '1c2458ea-22d6-480f-ae75-5f050eb08b2b' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.107 182627 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migration operation has completed#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.107 182627 INFO nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] _post_live_migration() is started..#033[00m
Jan 22 17:17:14 np0005592767 podman[212611]: 2026-01-22 22:17:14.125721949 +0000 UTC m=+0.042548572 container remove 3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.130 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bf131087-ec2f-4059-a2a4-7f76a184209b]: (4, ('Thu Jan 22 10:17:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740)\n3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740\nThu Jan 22 10:17:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740)\n3f0675b0d36986089f363a55d4961638fbae09abdd841ed1724313f101a7d740\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.132 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4fb88c-537a-47a7-865c-2f80dec8f926]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.132 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.134 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 kernel: tap0265f228-40: left promiscuous mode
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.149 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.152 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a748d207-37f4-4579-a83f-d431dfa0ca3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.167 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8670c0db-f961-4b23-930a-d73923a665e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.168 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5889a0a2-fac9-49af-b793-b61f49edf29c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.177 182627 DEBUG nova.network.neutron [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updated VIF entry in instance network info cache for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.178 182627 DEBUG nova.network.neutron [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Updating instance_info_cache with network_info: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.186 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bbbf6e49-bf34-4284-ab0e-44d55f5083a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381100, 'reachable_time': 38704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212632, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 systemd[1]: run-netns-ovnmeta\x2d0265f228\x2d4e11\x2d4f15\x2d8d77\x2d6acb409f3f7b.mount: Deactivated successfully.
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.189 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.189 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[8966cb16-0ffe-49d5-9297-20a657a7300c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.204 182627 DEBUG oslo_concurrency.lockutils [req-ce2c7892-49ec-4e12-8d6f-88474e6caa4a req-b782798c-5830-4443-9ec0-ea000e8c9061 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1c2458ea-22d6-480f-ae75-5f050eb08b2b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.369 182627 DEBUG nova.compute.manager [req-eba54c51-a675-4cf4-a0f4-0628ebfa522a req-e2ddf724-caef-4cb8-b80b-bef788b6e5ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.369 182627 DEBUG oslo_concurrency.lockutils [req-eba54c51-a675-4cf4-a0f4-0628ebfa522a req-e2ddf724-caef-4cb8-b80b-bef788b6e5ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.369 182627 DEBUG oslo_concurrency.lockutils [req-eba54c51-a675-4cf4-a0f4-0628ebfa522a req-e2ddf724-caef-4cb8-b80b-bef788b6e5ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.369 182627 DEBUG oslo_concurrency.lockutils [req-eba54c51-a675-4cf4-a0f4-0628ebfa522a req-e2ddf724-caef-4cb8-b80b-bef788b6e5ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.370 182627 DEBUG nova.compute.manager [req-eba54c51-a675-4cf4-a0f4-0628ebfa522a req-e2ddf724-caef-4cb8-b80b-bef788b6e5ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.370 182627 DEBUG nova.compute.manager [req-eba54c51-a675-4cf4-a0f4-0628ebfa522a req-e2ddf724-caef-4cb8-b80b-bef788b6e5ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.503 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.847 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:14.849 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.885 182627 DEBUG nova.network.neutron [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Activated binding for port 3cbb0272-18e2-4845-aa69-d6a35ecb0d03 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.886 182627 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.887 182627 DEBUG nova.virt.libvirt.vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:16:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-489483157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-489483157',id=7,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-p7iykyyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:17:03Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=1c2458ea-22d6-480f-ae75-5f050eb08b2b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.887 182627 DEBUG nova.network.os_vif_util [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "address": "fa:16:3e:8f:6c:2e", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cbb0272-18", "ovs_interfaceid": "3cbb0272-18e2-4845-aa69-d6a35ecb0d03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.888 182627 DEBUG nova.network.os_vif_util [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.888 182627 DEBUG os_vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.890 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.891 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cbb0272-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.892 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.893 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.896 182627 INFO os_vif [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:6c:2e,bridge_name='br-int',has_traffic_filtering=True,id=3cbb0272-18e2-4845-aa69-d6a35ecb0d03,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cbb0272-18')#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.896 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.896 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.896 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.897 182627 DEBUG nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.897 182627 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deleting instance files /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b_del#033[00m
Jan 22 17:17:14 np0005592767 nova_compute[182623]: 2026-01-22 22:17:14.898 182627 INFO nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Deletion of /var/lib/nova/instances/1c2458ea-22d6-480f-ae75-5f050eb08b2b_del complete#033[00m
Jan 22 17:17:15 np0005592767 nova_compute[182623]: 2026-01-22 22:17:15.005 182627 DEBUG nova.compute.manager [req-6619f831-dd06-406c-8377-4f7bca784d64 req-fc5e8623-400a-4c48-9075-03bbc6e049db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:15 np0005592767 nova_compute[182623]: 2026-01-22 22:17:15.006 182627 DEBUG oslo_concurrency.lockutils [req-6619f831-dd06-406c-8377-4f7bca784d64 req-fc5e8623-400a-4c48-9075-03bbc6e049db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:15 np0005592767 nova_compute[182623]: 2026-01-22 22:17:15.006 182627 DEBUG oslo_concurrency.lockutils [req-6619f831-dd06-406c-8377-4f7bca784d64 req-fc5e8623-400a-4c48-9075-03bbc6e049db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:15 np0005592767 nova_compute[182623]: 2026-01-22 22:17:15.007 182627 DEBUG oslo_concurrency.lockutils [req-6619f831-dd06-406c-8377-4f7bca784d64 req-fc5e8623-400a-4c48-9075-03bbc6e049db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:15 np0005592767 nova_compute[182623]: 2026-01-22 22:17:15.007 182627 DEBUG nova.compute.manager [req-6619f831-dd06-406c-8377-4f7bca784d64 req-fc5e8623-400a-4c48-9075-03bbc6e049db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:15 np0005592767 nova_compute[182623]: 2026-01-22 22:17:15.008 182627 DEBUG nova.compute.manager [req-6619f831-dd06-406c-8377-4f7bca784d64 req-fc5e8623-400a-4c48-9075-03bbc6e049db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-unplugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.447 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.447 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.447 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.448 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.448 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.448 182627 WARNING nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.449 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.449 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.449 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.449 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.450 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.450 182627 WARNING nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.450 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.451 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.451 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.451 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.451 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.452 182627 WARNING nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.452 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.452 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.452 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.453 182627 DEBUG oslo_concurrency.lockutils [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.453 182627 DEBUG nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] No waiting events found dispatching network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:16 np0005592767 nova_compute[182623]: 2026-01-22 22:17:16.453 182627 WARNING nova.compute.manager [req-3f0626ce-dc15-44f4-ac66-6a31d5f72da0 req-77372b28-e33f-484e-9dc0-e96e6f964b50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Received unexpected event network-vif-plugged-3cbb0272-18e2-4845-aa69-d6a35ecb0d03 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:17:19 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:17:19 np0005592767 systemd[212495]: Activating special unit Exit the Session...
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped target Main User Target.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped target Basic System.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped target Paths.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped target Sockets.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped target Timers.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:17:19 np0005592767 systemd[212495]: Closed D-Bus User Message Bus Socket.
Jan 22 17:17:19 np0005592767 systemd[212495]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:17:19 np0005592767 systemd[212495]: Removed slice User Application Slice.
Jan 22 17:17:19 np0005592767 systemd[212495]: Reached target Shutdown.
Jan 22 17:17:19 np0005592767 systemd[212495]: Finished Exit the Session.
Jan 22 17:17:19 np0005592767 systemd[212495]: Reached target Exit the Session.
Jan 22 17:17:19 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:17:19 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:17:19 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:17:19 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:17:19 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:17:19 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:17:19 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:17:19 np0005592767 podman[212633]: 2026-01-22 22:17:19.150889929 +0000 UTC m=+0.064270011 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.348 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.348 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.349 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.349 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.350 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.366 182627 INFO nova.compute.manager [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Terminating instance#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.386 182627 DEBUG nova.compute.manager [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:17:19 np0005592767 kernel: tap3c448812-44 (unregistering): left promiscuous mode
Jan 22 17:17:19 np0005592767 NetworkManager[54973]: <info>  [1769120239.4050] device (tap3c448812-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:17:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:19Z|00052|binding|INFO|Releasing lport 3c448812-44ed-4bd0-8b7d-3ad276e28d30 from this chassis (sb_readonly=0)
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.408 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:19Z|00053|binding|INFO|Setting lport 3c448812-44ed-4bd0-8b7d-3ad276e28d30 down in Southbound
Jan 22 17:17:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:19Z|00054|binding|INFO|Removing iface tap3c448812-44 ovn-installed in OVS
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.410 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.416 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:f5:6e 10.100.0.10'], port_security=['fa:16:3e:0c:f5:6e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2b3fa714-6109-48db-878b-f5e3d1420dba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31d2ab9-cf9f-454b-9df5-065776293667', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce818300105f44b6abd8aa2b62699bda', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3d5f0cd7-7223-4821-8ca2-c6a12fc0cebe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.247'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be6ad440-24bb-413c-a01d-43bdddaaa797, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3c448812-44ed-4bd0-8b7d-3ad276e28d30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.417 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3c448812-44ed-4bd0-8b7d-3ad276e28d30 in datapath a31d2ab9-cf9f-454b-9df5-065776293667 unbound from our chassis#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.418 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a31d2ab9-cf9f-454b-9df5-065776293667, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.419 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2576645d-b8f7-4104-9606-a95a996e22f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.420 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 namespace which is not needed anymore#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.424 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 22 17:17:19 np0005592767 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 12.801s CPU time.
Jan 22 17:17:19 np0005592767 systemd-machined[153912]: Machine qemu-4-instance-00000009 terminated.
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [NOTICE]   (212420) : haproxy version is 2.8.14-c23fe91
Jan 22 17:17:19 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [NOTICE]   (212420) : path to executable is /usr/sbin/haproxy
Jan 22 17:17:19 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [WARNING]  (212420) : Exiting Master process...
Jan 22 17:17:19 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [WARNING]  (212420) : Exiting Master process...
Jan 22 17:17:19 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [ALERT]    (212420) : Current worker (212422) exited with code 143 (Terminated)
Jan 22 17:17:19 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212416]: [WARNING]  (212420) : All workers exited. Exiting... (0)
Jan 22 17:17:19 np0005592767 systemd[1]: libpod-b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde.scope: Deactivated successfully.
Jan 22 17:17:19 np0005592767 podman[212680]: 2026-01-22 22:17:19.550444998 +0000 UTC m=+0.047906417 container died b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:17:19 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde-userdata-shm.mount: Deactivated successfully.
Jan 22 17:17:19 np0005592767 systemd[1]: var-lib-containers-storage-overlay-4084653e40aca75be4161705e52841198e423aa95597859e8026a916104eee0f-merged.mount: Deactivated successfully.
Jan 22 17:17:19 np0005592767 podman[212680]: 2026-01-22 22:17:19.586789429 +0000 UTC m=+0.084250848 container cleanup b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:17:19 np0005592767 systemd[1]: libpod-conmon-b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde.scope: Deactivated successfully.
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.605 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.611 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.643 182627 INFO nova.virt.libvirt.driver [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Instance destroyed successfully.#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.644 182627 DEBUG nova.objects.instance [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lazy-loading 'resources' on Instance uuid 2b3fa714-6109-48db-878b-f5e3d1420dba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:19 np0005592767 podman[212711]: 2026-01-22 22:17:19.656694032 +0000 UTC m=+0.045348113 container remove b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.658 182627 DEBUG nova.virt.libvirt.vif [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-373463314',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-373463314',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(27),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-373463314',id=9,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=27,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELnKbntEh9G5MZ12eWzXWY70wmzm3VzTvPu77lxVBnBlu+iJcGSuBYidHHZBA8p9HhdKhj62AfoRNg23cQd+DsuFFmxgPxFDa0kr+Edd2cHMgT/i1pJKJXbktLYbsAIGQ==',key_name='tempest-keypair-609803177',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:16:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce818300105f44b6abd8aa2b62699bda',ramdisk_id='',reservation_id='r-on1n29m8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-469507991',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:16:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a84b864eafb4f74a43b72cf303742cc',uuid=2b3fa714-6109-48db-878b-f5e3d1420dba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.658 182627 DEBUG nova.network.os_vif_util [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converting VIF {"id": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "address": "fa:16:3e:0c:f5:6e", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c448812-44", "ovs_interfaceid": "3c448812-44ed-4bd0-8b7d-3ad276e28d30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.659 182627 DEBUG nova.network.os_vif_util [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.659 182627 DEBUG os_vif [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.660 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.661 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c448812-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.662 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.662 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dda3841c-c912-443e-82a8-a633ff346f6b]: (4, ('Thu Jan 22 10:17:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 (b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde)\nb7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde\nThu Jan 22 10:17:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 (b7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde)\nb7b3b2a95e84ddb81bbf99c18e284f9b98d3345e3bba7e7643cb0ec857046dde\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.663 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.664 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[93efe423-6790-45bf-86f9-7e54ed3741bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.665 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa31d2ab9-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.665 182627 INFO os_vif [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:f5:6e,bridge_name='br-int',has_traffic_filtering=True,id=3c448812-44ed-4bd0-8b7d-3ad276e28d30,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c448812-44')#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.666 182627 INFO nova.virt.libvirt.driver [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Deleting instance files /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba_del#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.666 182627 INFO nova.virt.libvirt.driver [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Deletion of /var/lib/nova/instances/2b3fa714-6109-48db-878b-f5e3d1420dba_del complete#033[00m
Jan 22 17:17:19 np0005592767 kernel: tapa31d2ab9-c0: left promiscuous mode
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.668 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.679 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.682 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3d6631-fc21-448d-b5c0-370eec16f121]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.696 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa0c590-dfd0-4109-9dc0-8bcb62897d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.698 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[751ebdb2-113c-4003-8bb3-8c62be04b334]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.714 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[037f5793-a376-4e36-ab02-3f2ed2efb784]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381408, 'reachable_time': 37424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212739, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 systemd[1]: run-netns-ovnmeta\x2da31d2ab9\x2dcf9f\x2d454b\x2d9df5\x2d065776293667.mount: Deactivated successfully.
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.719 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:17:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:19.719 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3c210a1f-c0cc-42ea-8c54-a2051c5f17da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.769 182627 INFO nova.compute.manager [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.769 182627 DEBUG oslo.service.loopingcall [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.769 182627 DEBUG nova.compute.manager [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.770 182627 DEBUG nova.network.neutron [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.808 182627 DEBUG nova.compute.manager [req-92724b77-a87c-46c4-bdbc-b6dfaa061fe4 req-030ccd78-3111-4711-bdb4-ead8c85e7bdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-vif-unplugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.809 182627 DEBUG oslo_concurrency.lockutils [req-92724b77-a87c-46c4-bdbc-b6dfaa061fe4 req-030ccd78-3111-4711-bdb4-ead8c85e7bdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.809 182627 DEBUG oslo_concurrency.lockutils [req-92724b77-a87c-46c4-bdbc-b6dfaa061fe4 req-030ccd78-3111-4711-bdb4-ead8c85e7bdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.809 182627 DEBUG oslo_concurrency.lockutils [req-92724b77-a87c-46c4-bdbc-b6dfaa061fe4 req-030ccd78-3111-4711-bdb4-ead8c85e7bdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.809 182627 DEBUG nova.compute.manager [req-92724b77-a87c-46c4-bdbc-b6dfaa061fe4 req-030ccd78-3111-4711-bdb4-ead8c85e7bdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] No waiting events found dispatching network-vif-unplugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:19 np0005592767 nova_compute[182623]: 2026-01-22 22:17:19.809 182627 DEBUG nova.compute.manager [req-92724b77-a87c-46c4-bdbc-b6dfaa061fe4 req-030ccd78-3111-4711-bdb4-ead8c85e7bdb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-vif-unplugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:17:21 np0005592767 podman[212741]: 2026-01-22 22:17:21.148604773 +0000 UTC m=+0.063198530 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, release=1755695350, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 22 17:17:21 np0005592767 podman[212740]: 2026-01-22 22:17:21.174852002 +0000 UTC m=+0.090980183 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.722 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.722 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.722 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "1c2458ea-22d6-480f-ae75-5f050eb08b2b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.747 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.748 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.748 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.748 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:17:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:21.851 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.934 182627 WARNING nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.935 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5705MB free_disk=73.38220596313477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.935 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:21 np0005592767 nova_compute[182623]: 2026-01-22 22:17:21.936 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.019 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Migration for instance 1c2458ea-22d6-480f-ae75-5f050eb08b2b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.038 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.060 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Instance 2b3fa714-6109-48db-878b-f5e3d1420dba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.061 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Migration 212b821b-cc9d-4094-bb7b-b23ad6071dc7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.061 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.061 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.132 182627 DEBUG nova.compute.provider_tree [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.153 182627 DEBUG nova.scheduler.client.report [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.164 182627 DEBUG nova.compute.manager [req-2229893c-3f2c-4878-9eb6-a59d1fa7643b req-c15510a3-2188-4c59-a41f-081ff571c72a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.164 182627 DEBUG oslo_concurrency.lockutils [req-2229893c-3f2c-4878-9eb6-a59d1fa7643b req-c15510a3-2188-4c59-a41f-081ff571c72a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.165 182627 DEBUG oslo_concurrency.lockutils [req-2229893c-3f2c-4878-9eb6-a59d1fa7643b req-c15510a3-2188-4c59-a41f-081ff571c72a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.165 182627 DEBUG oslo_concurrency.lockutils [req-2229893c-3f2c-4878-9eb6-a59d1fa7643b req-c15510a3-2188-4c59-a41f-081ff571c72a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.165 182627 DEBUG nova.compute.manager [req-2229893c-3f2c-4878-9eb6-a59d1fa7643b req-c15510a3-2188-4c59-a41f-081ff571c72a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] No waiting events found dispatching network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.166 182627 WARNING nova.compute.manager [req-2229893c-3f2c-4878-9eb6-a59d1fa7643b req-c15510a3-2188-4c59-a41f-081ff571c72a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received unexpected event network-vif-plugged-3c448812-44ed-4bd0-8b7d-3ad276e28d30 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.189 182627 DEBUG nova.compute.resource_tracker [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.189 182627 DEBUG oslo_concurrency.lockutils [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.211 182627 INFO nova.compute.manager [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.300 182627 INFO nova.scheduler.client.report [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Deleted allocation for migration 212b821b-cc9d-4094-bb7b-b23ad6071dc7#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.301 182627 DEBUG nova.virt.libvirt.driver [None req-30d66cbc-1108-47a0-bd2c-03e6e39b2ff5 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.391 182627 DEBUG nova.network.neutron [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.414 182627 INFO nova.compute.manager [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Took 2.64 seconds to deallocate network for instance.#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.494 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.495 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.544 182627 DEBUG nova.compute.provider_tree [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.559 182627 DEBUG nova.scheduler.client.report [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.582 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.610 182627 INFO nova.scheduler.client.report [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Deleted allocations for instance 2b3fa714-6109-48db-878b-f5e3d1420dba#033[00m
Jan 22 17:17:22 np0005592767 nova_compute[182623]: 2026-01-22 22:17:22.717 182627 DEBUG oslo_concurrency.lockutils [None req-3efad6f7-ca0d-4b5e-9e1e-11dabea5f1f6 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "2b3fa714-6109-48db-878b-f5e3d1420dba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.060 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.061 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.073 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.162 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.163 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.169 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.169 182627 INFO nova.compute.claims [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.264 182627 DEBUG nova.compute.manager [req-ebacef8e-fd4f-476d-9741-76a9f6d8763c req-dfbdc3e5-23b5-41dc-a29c-28e63accbae4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Received event network-vif-deleted-3c448812-44ed-4bd0-8b7d-3ad276e28d30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.296 182627 DEBUG nova.compute.provider_tree [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.311 182627 DEBUG nova.scheduler.client.report [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.333 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
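The `oslo_concurrency.lockutils` lines above follow a fixed message shape (lock name, acquired/released, holder, waited/held seconds), so lock contention can be measured straight from the log. A small stdlib sketch, with a regex written only against the message format visible in these lines:

```python
import re

# Matches oslo.concurrency lockutils messages as they appear in this log, e.g.:
#   Lock "compute_resources" acquired by "...instance_claim" :: waited 0.001s
#   Lock "compute_resources" "released" by "...instance_claim" :: held 0.171s
LOCK_RE = re.compile(
    r'Lock "(?P<name>[^"]+)" "?(?P<event>acquired|released)"?'
    r' by "(?P<holder>[^"]+)" :: (?:waited|held) (?P<secs>[0-9.]+)s'
)

def parse_lock_events(lines):
    """Yield (lock_name, event, holder, seconds) for each lockutils line."""
    for line in lines:
        m = LOCK_RE.search(line)
        if m:
            yield m["name"], m["event"], m["holder"], float(m["secs"])

sample = [
    'Lock "compute_resources" acquired by '
    '"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s',
    'Lock "compute_resources" "released" by '
    '"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s',
]
events = list(parse_lock_events(sample))
```

Here the instance claim held `compute_resources` for 0.171 s; feeding the full journal through the same parser would surface any lock held unusually long.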
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.334 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.417 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.418 182627 DEBUG nova.network.neutron [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.434 182627 INFO nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.475 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.506 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.579 182627 DEBUG nova.policy [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a84b864eafb4f74a43b72cf303742cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce818300105f44b6abd8aa2b62699bda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
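The policy line above shows `network:attach_external_network` failing for a caller whose roles are only `['reader', 'member']`. Real evaluation goes through oslo.policy; purely as an illustration of why this credentials dict fails a role-gated rule, here is a toy evaluator for single `kind:value` atoms (the atom syntax and helper are illustrative, not oslo.policy's API):

```python
def check_atom(atom, creds):
    """Evaluate one 'kind:value' policy atom against a credentials dict.
    Illustrative only -- production checks use oslo.policy, not this."""
    kind, _, value = atom.partition(":")
    if kind == "role":
        return value in creds.get("roles", [])
    if kind == "project_id":
        return creds.get("project_id") == value
    if kind == "is_admin":
        return creds.get("is_admin") is (value == "True")
    return False

# Credentials as logged for the failed network:attach_external_network check:
creds = {
    "is_admin": False,
    "user_id": "5a84b864eafb4f74a43b72cf303742cc",
    "project_id": "ce818300105f44b6abd8aa2b62699bda",
    "roles": ["reader", "member"],
}

member = check_atom("role:member", creds)  # caller does hold the member role
admin = check_atom("role:admin", creds)    # no admin role: an admin-gated rule fails
```

With neither an admin role nor `is_admin`, any rule requiring elevated privileges rejects these credentials, which matches the logged failure; the boot continues because this particular policy denial is non-fatal.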
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.657 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.658 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.659 182627 INFO nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Creating image(s)#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.659 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.659 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.660 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.671 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.673 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.728 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
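Each `qemu-img info` call above is wrapped in `oslo_concurrency.prlimit --as=1073741824 --cpu=30`, capping the child's address space at 1 GiB and its CPU time at 30 s so a pathological image cannot hang or balloon the compute service. The same effect can be sketched with the stdlib `resource` module on POSIX (running a trivial `python3 -c` child here instead of `qemu-img`, which is an assumption for illustration):

```python
import resource
import subprocess
import sys

AS_LIMIT = 1073741824   # 1 GiB address-space cap, as in --as=1073741824
CPU_LIMIT = 30          # 30 s CPU-time cap, as in --cpu=30

def _apply_limits():
    # Runs in the child between fork and exec, like oslo's prlimit wrapper.
    resource.setrlimit(resource.RLIMIT_AS, (AS_LIMIT, AS_LIMIT))
    resource.setrlimit(resource.RLIMIT_CPU, (CPU_LIMIT, CPU_LIMIT))

def run_limited(cmd):
    """Run cmd under the caps with a C locale, mirroring `env LC_ALL=C LANG=C ...`."""
    return subprocess.run(
        cmd,
        preexec_fn=_apply_limits,            # POSIX only
        env={"LC_ALL": "C", "LANG": "C"},
        capture_output=True, text=True, check=True,
    ).stdout

# Stand-in child process instead of `qemu-img info ... --output=json`:
out = run_limited([sys.executable, "-c", "print('ok')"])
```

A child that exceeds either cap is killed by the kernel instead of degrading the nova-compute process, which is why every image inspection in this log pays the small wrapper overhead (~0.05 s per call here).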
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.728 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.729 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.739 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.828 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.829 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.860 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.861 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.861 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.915 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.916 182627 DEBUG nova.virt.disk.api [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Checking if we can resize image /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.916 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.977 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.977 182627 DEBUG nova.virt.disk.api [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Cannot resize image /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
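The "Cannot resize image ... to a smaller size" line records nova's `can_resize_image` check: it reads the disk's current virtual size from `qemu-img info --output=json` and refuses to shrink. A minimal stdlib sketch of that decision (the JSON sample below is hand-written for illustration; `virtual-size` is the field qemu-img emits):

```python
import json

def can_resize_image(qemu_img_info_json, requested_bytes):
    """Allow a resize only if it would not shrink the disk below its
    current virtual size -- the check logged above."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return requested_bytes >= virtual_size

# Hand-written sample of `qemu-img info --force-share --output=json` output:
info = '{"virtual-size": 1073741824, "format": "qcow2", "actual-size": 200704}'

grow = can_resize_image(info, 2 * 1073741824)   # growing is permitted
shrink = can_resize_image(info, 536870912)      # shrinking is refused
```

Shrinking a qcow2 under a running guest would truncate the filesystem inside it, so nova simply skips the resize step here and proceeds with the disk at its existing virtual size.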
Jan 22 17:17:24 np0005592767 nova_compute[182623]: 2026-01-22 22:17:24.978 182627 DEBUG nova.objects.instance [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lazy-loading 'migration_context' on Instance uuid 888fa71f-f52d-41c5-8814-4e0b8670b601 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.002 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.002 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.003 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.003 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.004 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.004 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.024 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.025 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.061 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.061 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.072 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.123 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.124 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.125 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.135 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.187 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.187 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.eph0 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.221 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.eph0 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.222 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.223 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.276 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.277 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.278 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Ensure instance console log exists: /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.278 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.278 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.279 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:25 np0005592767 nova_compute[182623]: 2026-01-22 22:17:25.619 182627 DEBUG nova.network.neutron [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Successfully created port: 421fa598-6e14-4c3a-825c-abd482768676 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.520 182627 DEBUG nova.network.neutron [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Successfully updated port: 421fa598-6e14-4c3a-825c-abd482768676 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.533 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.534 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquired lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.534 182627 DEBUG nova.network.neutron [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.684 182627 DEBUG nova.compute.manager [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-changed-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.684 182627 DEBUG nova.compute.manager [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Refreshing instance network info cache due to event network-changed-421fa598-6e14-4c3a-825c-abd482768676. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.685 182627 DEBUG oslo_concurrency.lockutils [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:26 np0005592767 nova_compute[182623]: 2026-01-22 22:17:26.780 182627 DEBUG nova.network.neutron [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:17:27 np0005592767 podman[212816]: 2026-01-22 22:17:27.167937004 +0000 UTC m=+0.072231971 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.531 182627 DEBUG nova.network.neutron [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Updating instance_info_cache with network_info: [{"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.561 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Releasing lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.562 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Instance network_info: |[{"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.563 182627 DEBUG oslo_concurrency.lockutils [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.564 182627 DEBUG nova.network.neutron [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Refreshing network info cache for port 421fa598-6e14-4c3a-825c-abd482768676 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.569 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Start _get_guest_xml network_info=[{"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [{'device_name': '/dev/vdb', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 1}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.576 182627 WARNING nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.583 182627 DEBUG nova.virt.libvirt.host [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.583 182627 DEBUG nova.virt.libvirt.host [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.589 182627 DEBUG nova.virt.libvirt.host [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.590 182627 DEBUG nova.virt.libvirt.host [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.591 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.591 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:16:47Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='2128535730',id=26,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-1591494949',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.592 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.592 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.592 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.593 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.593 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.593 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.594 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.594 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.594 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.594 182627 DEBUG nova.virt.hardware [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.597 182627 DEBUG nova.virt.libvirt.vif [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:17:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1490710752',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1490710752',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1490710752',id=11,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELnKbntEh9G5MZ12eWzXWY70wmzm3VzTvPu77lxVBnBlu+iJcGSuBYidHHZBA8p9HhdKhj62AfoRNg23cQd+DsuFFmxgPxFDa0kr+Edd2cHMgT/i1pJKJXbktLYbsAIGQ==',key_name='tempest-keypair-609803177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce818300105f44b6abd8aa2b62699bda',ramdisk_id='',reservation_id='r-ux0s75us',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-469507991',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:17:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a84b864eafb4f74a43b72cf303742cc',uuid=888fa71f-f52d-41c5-8814-4e0b8670b601,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.598 182627 DEBUG nova.network.os_vif_util [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converting VIF {"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.598 182627 DEBUG nova.network.os_vif_util [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.599 182627 DEBUG nova.objects.instance [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lazy-loading 'pci_devices' on Instance uuid 888fa71f-f52d-41c5-8814-4e0b8670b601 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.610 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <uuid>888fa71f-f52d-41c5-8814-4e0b8670b601</uuid>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <name>instance-0000000b</name>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-1490710752</nova:name>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:17:28</nova:creationTime>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:flavor name="tempest-flavor_with_ephemeral_1-1591494949">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:ephemeral>1</nova:ephemeral>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:user uuid="5a84b864eafb4f74a43b72cf303742cc">tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member</nova:user>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:project uuid="ce818300105f44b6abd8aa2b62699bda">tempest-ServersWithSpecificFlavorTestJSON-469507991</nova:project>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        <nova:port uuid="421fa598-6e14-4c3a-825c-abd482768676">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <entry name="serial">888fa71f-f52d-41c5-8814-4e0b8670b601</entry>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <entry name="uuid">888fa71f-f52d-41c5-8814-4e0b8670b601</entry>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.eph0"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <target dev="vdb" bus="virtio"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.config"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:e7:ae:b5"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <target dev="tap421fa598-6e"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/console.log" append="off"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:17:28 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:17:28 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:17:28 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:17:28 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.611 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Preparing to wait for external event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.612 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.612 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.612 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.613 182627 DEBUG nova.virt.libvirt.vif [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:17:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1490710752',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1490710752',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1490710752',id=11,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELnKbntEh9G5MZ12eWzXWY70wmzm3VzTvPu77lxVBnBlu+iJcGSuBYidHHZBA8p9HhdKhj62AfoRNg23cQd+DsuFFmxgPxFDa0kr+Edd2cHMgT/i1pJKJXbktLYbsAIGQ==',key_name='tempest-keypair-609803177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce818300105f44b6abd8aa2b62699bda',ramdisk_id='',reservation_id='r-ux0s75us',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-469507991',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:17:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a84b864eafb4f74a43b72cf303742cc',uuid=888fa71f-f52d-41c5-8814-4e0b8670b601,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.613 182627 DEBUG nova.network.os_vif_util [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converting VIF {"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.614 182627 DEBUG nova.network.os_vif_util [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.615 182627 DEBUG os_vif [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.615 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.616 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.616 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.618 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.618 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap421fa598-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.619 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap421fa598-6e, col_values=(('external_ids', {'iface-id': '421fa598-6e14-4c3a-825c-abd482768676', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:ae:b5', 'vm-uuid': '888fa71f-f52d-41c5-8814-4e0b8670b601'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:28 np0005592767 NetworkManager[54973]: <info>  [1769120248.6211] manager: (tap421fa598-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.623 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.626 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.626 182627 INFO os_vif [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e')#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.676 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.676 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.676 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.676 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] No VIF found with MAC fa:16:3e:e7:ae:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:17:28 np0005592767 nova_compute[182623]: 2026-01-22 22:17:28.677 182627 INFO nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Using config drive#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.080 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120234.0782814, 1c2458ea-22d6-480f-ae75-5f050eb08b2b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.080 182627 INFO nova.compute.manager [-] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.099 182627 DEBUG nova.compute.manager [None req-8f6c3ebb-4d7c-4bbb-b3a7-13e5c3693d10 - - - - - -] [instance: 1c2458ea-22d6-480f-ae75-5f050eb08b2b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.167 182627 INFO nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Creating config drive at /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.config#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.179 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb52m01mv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.315 182627 DEBUG oslo_concurrency.processutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb52m01mv" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:29 np0005592767 kernel: tap421fa598-6e: entered promiscuous mode
Jan 22 17:17:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:29Z|00055|binding|INFO|Claiming lport 421fa598-6e14-4c3a-825c-abd482768676 for this chassis.
Jan 22 17:17:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:29Z|00056|binding|INFO|421fa598-6e14-4c3a-825c-abd482768676: Claiming fa:16:3e:e7:ae:b5 10.100.0.6
Jan 22 17:17:29 np0005592767 NetworkManager[54973]: <info>  [1769120249.3827] manager: (tap421fa598-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.381 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.391 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:ae:b5 10.100.0.6'], port_security=['fa:16:3e:e7:ae:b5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '888fa71f-f52d-41c5-8814-4e0b8670b601', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31d2ab9-cf9f-454b-9df5-065776293667', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce818300105f44b6abd8aa2b62699bda', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3d5f0cd7-7223-4821-8ca2-c6a12fc0cebe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be6ad440-24bb-413c-a01d-43bdddaaa797, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=421fa598-6e14-4c3a-825c-abd482768676) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.393 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 421fa598-6e14-4c3a-825c-abd482768676 in datapath a31d2ab9-cf9f-454b-9df5-065776293667 bound to our chassis#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.393 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:29Z|00057|binding|INFO|Setting lport 421fa598-6e14-4c3a-825c-abd482768676 ovn-installed in OVS
Jan 22 17:17:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:29Z|00058|binding|INFO|Setting lport 421fa598-6e14-4c3a-825c-abd482768676 up in Southbound
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.394 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.396 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a31d2ab9-cf9f-454b-9df5-065776293667#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.398 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.408 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c936cddf-7a7c-4802-914b-a923058e119f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.409 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa31d2ab9-c1 in ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.411 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa31d2ab9-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.411 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[92cec640-95b5-419c-b097-b74b210d4bc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.412 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a17f50c1-d1e1-48c6-8d50-6b90d8605755]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 systemd-machined[153912]: New machine qemu-5-instance-0000000b.
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.422 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae375d5-931b-4346-8c59-5324443fb891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 systemd[1]: Started Virtual Machine qemu-5-instance-0000000b.
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.436 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[07c29f92-2ebd-402a-a869-648c45a6bc39]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 systemd-udevd[212860]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:29 np0005592767 NetworkManager[54973]: <info>  [1769120249.4545] device (tap421fa598-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:17:29 np0005592767 NetworkManager[54973]: <info>  [1769120249.4549] device (tap421fa598-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.473 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[99b2e95b-a1fe-4941-9b0d-fe73114b42a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 NetworkManager[54973]: <info>  [1769120249.4829] manager: (tapa31d2ab9-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.482 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fd71e679-b4a8-4ea4-82c6-8d416201254c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.507 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.528 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[766b2ea5-31b3-4da0-b472-6582db593976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.531 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4d83cbc9-7286-4280-ae85-eb339d927a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 NetworkManager[54973]: <info>  [1769120249.5560] device (tapa31d2ab9-c0): carrier: link connected
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.562 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[93142ad8-84b1-4d12-b37f-5adaa196d3d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.584 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[debf2735-145d-41dd-9de4-3788ee70e5c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa31d2ab9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:73:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384613, 'reachable_time': 23347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212890, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.603 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a9646fba-aa6f-4610-a2b5-b9cca87a05dd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:7381'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384613, 'tstamp': 384613}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212896, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.613 182627 DEBUG nova.compute.manager [req-19e48dde-9733-473e-88a1-574ea8fb3a26 req-3b5a5072-561d-4323-ae79-cee7f431c2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.614 182627 DEBUG oslo_concurrency.lockutils [req-19e48dde-9733-473e-88a1-574ea8fb3a26 req-3b5a5072-561d-4323-ae79-cee7f431c2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.614 182627 DEBUG oslo_concurrency.lockutils [req-19e48dde-9733-473e-88a1-574ea8fb3a26 req-3b5a5072-561d-4323-ae79-cee7f431c2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.615 182627 DEBUG oslo_concurrency.lockutils [req-19e48dde-9733-473e-88a1-574ea8fb3a26 req-3b5a5072-561d-4323-ae79-cee7f431c2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.615 182627 DEBUG nova.compute.manager [req-19e48dde-9733-473e-88a1-574ea8fb3a26 req-3b5a5072-561d-4323-ae79-cee7f431c2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Processing event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.621 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2334a820-825f-4fcd-8ddc-89e982372c79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa31d2ab9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:73:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384613, 'reachable_time': 23347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212898, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.656 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cc221d47-efe6-4102-8bf7-ef104ddbe931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.669 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120249.6694143, 888fa71f-f52d-41c5-8814-4e0b8670b601 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.670 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] VM Started (Lifecycle Event)#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.672 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.675 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.678 182627 INFO nova.virt.libvirt.driver [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Instance spawned successfully.#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.679 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.698 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.703 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.707 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.707 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.707 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.708 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.708 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.709 182627 DEBUG nova.virt.libvirt.driver [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.712 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7b566c-8878-4ebf-bd20-7964384418ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.714 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa31d2ab9-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.714 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.715 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa31d2ab9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:29 np0005592767 NetworkManager[54973]: <info>  [1769120249.7174] manager: (tapa31d2ab9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 22 17:17:29 np0005592767 kernel: tapa31d2ab9-c0: entered promiscuous mode
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.716 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.720 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa31d2ab9-c0, col_values=(('external_ids', {'iface-id': '06d3bb0f-011c-43a0-a675-327a84bfa758'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:29Z|00059|binding|INFO|Releasing lport 06d3bb0f-011c-43a0-a675-327a84bfa758 from this chassis (sb_readonly=0)
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.721 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.723 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a31d2ab9-cf9f-454b-9df5-065776293667.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a31d2ab9-cf9f-454b-9df5-065776293667.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.724 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[174d2cdd-4a93-4024-a9d0-e61aac4500c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.724 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-a31d2ab9-cf9f-454b-9df5-065776293667
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/a31d2ab9-cf9f-454b-9df5-065776293667.pid.haproxy
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID a31d2ab9-cf9f-454b-9df5-065776293667
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:17:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:29.725 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'env', 'PROCESS_TAG=haproxy-a31d2ab9-cf9f-454b-9df5-065776293667', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a31d2ab9-cf9f-454b-9df5-065776293667.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.763 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.764 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120249.6703124, 888fa71f-f52d-41c5-8814-4e0b8670b601 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.764 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.808 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.811 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120249.6749773, 888fa71f-f52d-41c5-8814-4e0b8670b601 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.811 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.827 182627 INFO nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Took 5.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.827 182627 DEBUG nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.837 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.839 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.879 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.945 182627 INFO nova.compute.manager [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Took 5.82 seconds to build instance.#033[00m
Jan 22 17:17:29 np0005592767 nova_compute[182623]: 2026-01-22 22:17:29.987 182627 DEBUG oslo_concurrency.lockutils [None req-3c0d23b5-c48e-4f21-a7f7-ad5930c7ddf8 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:30 np0005592767 podman[212931]: 2026-01-22 22:17:30.142381466 +0000 UTC m=+0.058773912 container create 38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:17:30 np0005592767 systemd[1]: Started libpod-conmon-38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d.scope.
Jan 22 17:17:30 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:17:30 np0005592767 podman[212931]: 2026-01-22 22:17:30.112770749 +0000 UTC m=+0.029163225 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:17:30 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c24e27fc7a191b71589a2d48aa5ab095705fc6e4126c44cce889b02127eaff79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:17:30 np0005592767 podman[212931]: 2026-01-22 22:17:30.222066361 +0000 UTC m=+0.138458807 container init 38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:17:30 np0005592767 podman[212931]: 2026-01-22 22:17:30.228458546 +0000 UTC m=+0.144850972 container start 38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:17:30 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [NOTICE]   (212956) : New worker (212964) forked
Jan 22 17:17:30 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [NOTICE]   (212956) : Loading success.
Jan 22 17:17:30 np0005592767 podman[212949]: 2026-01-22 22:17:30.286972449 +0000 UTC m=+0.064183168 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:17:30 np0005592767 nova_compute[182623]: 2026-01-22 22:17:30.296 182627 DEBUG nova.network.neutron [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Updated VIF entry in instance network info cache for port 421fa598-6e14-4c3a-825c-abd482768676. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:17:30 np0005592767 nova_compute[182623]: 2026-01-22 22:17:30.297 182627 DEBUG nova.network.neutron [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Updating instance_info_cache with network_info: [{"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:30 np0005592767 nova_compute[182623]: 2026-01-22 22:17:30.315 182627 DEBUG oslo_concurrency.lockutils [req-63a373cc-323f-4eba-8b1f-42a3e5482bb0 req-2a8b5236-e539-46ab-91be-0f6e343a5c2a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:31 np0005592767 nova_compute[182623]: 2026-01-22 22:17:31.685 182627 DEBUG nova.compute.manager [req-29bf58e0-0068-40a6-98b2-a9e1cf08367e req-6d8ed6e4-6314-4c96-ac7a-dc9da7504b5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:31 np0005592767 nova_compute[182623]: 2026-01-22 22:17:31.686 182627 DEBUG oslo_concurrency.lockutils [req-29bf58e0-0068-40a6-98b2-a9e1cf08367e req-6d8ed6e4-6314-4c96-ac7a-dc9da7504b5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:31 np0005592767 nova_compute[182623]: 2026-01-22 22:17:31.687 182627 DEBUG oslo_concurrency.lockutils [req-29bf58e0-0068-40a6-98b2-a9e1cf08367e req-6d8ed6e4-6314-4c96-ac7a-dc9da7504b5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:31 np0005592767 nova_compute[182623]: 2026-01-22 22:17:31.687 182627 DEBUG oslo_concurrency.lockutils [req-29bf58e0-0068-40a6-98b2-a9e1cf08367e req-6d8ed6e4-6314-4c96-ac7a-dc9da7504b5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:31 np0005592767 nova_compute[182623]: 2026-01-22 22:17:31.687 182627 DEBUG nova.compute.manager [req-29bf58e0-0068-40a6-98b2-a9e1cf08367e req-6d8ed6e4-6314-4c96-ac7a-dc9da7504b5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] No waiting events found dispatching network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:17:31 np0005592767 nova_compute[182623]: 2026-01-22 22:17:31.688 182627 WARNING nova.compute.manager [req-29bf58e0-0068-40a6-98b2-a9e1cf08367e req-6d8ed6e4-6314-4c96-ac7a-dc9da7504b5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received unexpected event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 for instance with vm_state active and task_state None.
Jan 22 17:17:33 np0005592767 nova_compute[182623]: 2026-01-22 22:17:33.274 182627 DEBUG nova.compute.manager [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-changed-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:17:33 np0005592767 nova_compute[182623]: 2026-01-22 22:17:33.275 182627 DEBUG nova.compute.manager [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Refreshing instance network info cache due to event network-changed-421fa598-6e14-4c3a-825c-abd482768676. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:17:33 np0005592767 nova_compute[182623]: 2026-01-22 22:17:33.275 182627 DEBUG oslo_concurrency.lockutils [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:17:33 np0005592767 nova_compute[182623]: 2026-01-22 22:17:33.275 182627 DEBUG oslo_concurrency.lockutils [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:17:33 np0005592767 nova_compute[182623]: 2026-01-22 22:17:33.276 182627 DEBUG nova.network.neutron [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Refreshing network info cache for port 421fa598-6e14-4c3a-825c-abd482768676 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:17:33 np0005592767 nova_compute[182623]: 2026-01-22 22:17:33.624 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.641 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120239.6406348, 2b3fa714-6109-48db-878b-f5e3d1420dba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.642 182627 INFO nova.compute.manager [-] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] VM Stopped (Lifecycle Event)
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.658 182627 DEBUG nova.compute.manager [None req-fdd9d9cf-b954-4571-b02f-0f450d7cd664 - - - - - -] [instance: 2b3fa714-6109-48db-878b-f5e3d1420dba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.682 182627 DEBUG nova.network.neutron [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Updated VIF entry in instance network info cache for port 421fa598-6e14-4c3a-825c-abd482768676. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.683 182627 DEBUG nova.network.neutron [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Updating instance_info_cache with network_info: [{"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:17:34 np0005592767 nova_compute[182623]: 2026-01-22 22:17:34.699 182627 DEBUG oslo_concurrency.lockutils [req-bb220719-933e-4fbe-a6e6-8e738716afa8 req-bb1fc780-0b82-4e48-a395-ceaa449c16da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-888fa71f-f52d-41c5-8814-4e0b8670b601" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:17:37 np0005592767 podman[212986]: 2026-01-22 22:17:37.143094889 +0000 UTC m=+0.060313816 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.632 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.719 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.720 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.751 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.839 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.840 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.845 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.846 182627 INFO nova.compute.claims [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:17:38 np0005592767 nova_compute[182623]: 2026-01-22 22:17:38.988 182627 DEBUG nova.compute.provider_tree [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.008 182627 DEBUG nova.scheduler.client.report [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.029 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.030 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.082 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.083 182627 DEBUG nova.network.neutron [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.098 182627 INFO nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.116 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.244 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.246 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.246 182627 INFO nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Creating image(s)
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.247 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "/var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.247 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "/var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.248 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "/var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.265 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.326 182627 DEBUG nova.network.neutron [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.326 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.350 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.351 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.351 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.363 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.418 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.420 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.453 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.454 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.454 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.512 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.549 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.550 182627 DEBUG nova.virt.disk.api [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Checking if we can resize image /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.550 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.615 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.616 182627 DEBUG nova.virt.disk.api [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Cannot resize image /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.616 182627 DEBUG nova.objects.instance [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lazy-loading 'migration_context' on Instance uuid 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.630 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.631 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Ensure instance console log exists: /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.631 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.631 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.632 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.633 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.637 182627 WARNING nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.641 182627 DEBUG nova.virt.libvirt.host [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.641 182627 DEBUG nova.virt.libvirt.host [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.645 182627 DEBUG nova.virt.libvirt.host [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.645 182627 DEBUG nova.virt.libvirt.host [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.646 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.646 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.647 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.648 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.648 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.648 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.648 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.649 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.649 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.649 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.649 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.649 182627 DEBUG nova.virt.hardware [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.653 182627 DEBUG nova.objects.instance [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lazy-loading 'pci_devices' on Instance uuid 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.671 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <uuid>0fb222ec-1e7d-4b8d-821b-aded7d7f5678</uuid>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <name>instance-0000000f</name>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-369124310</nova:name>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:17:39</nova:creationTime>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:user uuid="3720ae93dc774d46a0b3cc6c5ba2421b">tempest-ServersAdminNegativeTestJSON-1610965729-project-member</nova:user>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:        <nova:project uuid="fa9926c22bea49b3aca22e946dca07db">tempest-ServersAdminNegativeTestJSON-1610965729</nova:project>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <entry name="serial">0fb222ec-1e7d-4b8d-821b-aded7d7f5678</entry>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <entry name="uuid">0fb222ec-1e7d-4b8d-821b-aded7d7f5678</entry>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.config"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/console.log" append="off"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:17:39 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:17:39 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:17:39 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:17:39 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.722 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.723 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:17:39 np0005592767 nova_compute[182623]: 2026-01-22 22:17:39.724 182627 INFO nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Using config drive
Jan 22 17:17:40 np0005592767 nova_compute[182623]: 2026-01-22 22:17:40.650 182627 INFO nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Creating config drive at /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.config
Jan 22 17:17:40 np0005592767 nova_compute[182623]: 2026-01-22 22:17:40.656 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4xybh53x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:17:40 np0005592767 nova_compute[182623]: 2026-01-22 22:17:40.789 182627 DEBUG oslo_concurrency.processutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4xybh53x" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:17:40 np0005592767 systemd-machined[153912]: New machine qemu-6-instance-0000000f.
Jan 22 17:17:40 np0005592767 systemd[1]: Started Virtual Machine qemu-6-instance-0000000f.
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.347 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120261.3474064, 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.349 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] VM Resumed (Lifecycle Event)
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.352 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.352 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.355 182627 INFO nova.virt.libvirt.driver [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Instance spawned successfully.
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.356 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.374 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.380 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.382 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.383 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.383 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.384 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.384 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.384 182627 DEBUG nova.virt.libvirt.driver [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.405 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.405 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120261.3483515, 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.405 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] VM Started (Lifecycle Event)
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.427 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.429 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.447 182627 INFO nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Took 2.20 seconds to spawn the instance on the hypervisor.
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.448 182627 DEBUG nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.453 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.514 182627 INFO nova.compute.manager [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Took 2.71 seconds to build instance.
Jan 22 17:17:41 np0005592767 nova_compute[182623]: 2026-01-22 22:17:41.528 182627 DEBUG oslo_concurrency.lockutils [None req-e699e6ae-326d-4b04-bbcb-c669d1c38d03 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.808s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:17:42 np0005592767 nova_compute[182623]: 2026-01-22 22:17:42.987 182627 DEBUG nova.objects.instance [None req-5272c3c0-065d-427d-918f-4c8fc18574d7 09d140c294e94f359ff1d86f5b1a829b dd5671a8ef104519952b3efffba13ce1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:17:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:42Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:ae:b5 10.100.0.6
Jan 22 17:17:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:42Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:ae:b5 10.100.0.6
Jan 22 17:17:43 np0005592767 nova_compute[182623]: 2026-01-22 22:17:43.023 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120263.023229, 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:43 np0005592767 nova_compute[182623]: 2026-01-22 22:17:43.023 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:17:43 np0005592767 nova_compute[182623]: 2026-01-22 22:17:43.052 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:43 np0005592767 nova_compute[182623]: 2026-01-22 22:17:43.055 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:17:43 np0005592767 nova_compute[182623]: 2026-01-22 22:17:43.081 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 22 17:17:43 np0005592767 nova_compute[182623]: 2026-01-22 22:17:43.636 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:43 np0005592767 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 22 17:17:43 np0005592767 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000f.scope: Consumed 2.253s CPU time.
Jan 22 17:17:43 np0005592767 systemd-machined[153912]: Machine qemu-6-instance-0000000f terminated.
Jan 22 17:17:44 np0005592767 nova_compute[182623]: 2026-01-22 22:17:44.003 182627 DEBUG nova.compute.manager [None req-5272c3c0-065d-427d-918f-4c8fc18574d7 09d140c294e94f359ff1d86f5b1a829b dd5671a8ef104519952b3efffba13ce1 - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:44 np0005592767 nova_compute[182623]: 2026-01-22 22:17:44.515 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:46 np0005592767 nova_compute[182623]: 2026-01-22 22:17:46.085 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Creating tmpfile /var/lib/nova/instances/tmphpugygql to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 22 17:17:46 np0005592767 nova_compute[182623]: 2026-01-22 22:17:46.086 182627 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 22 17:17:48 np0005592767 nova_compute[182623]: 2026-01-22 22:17:48.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.247 182627 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb864a01-1633-42f3-ac5f-4d664cc5d477',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.281 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.281 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.282 182627 DEBUG nova.network.neutron [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.517 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.588 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.588 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.589 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.589 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.589 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.601 182627 INFO nova.compute.manager [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Terminating instance#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.612 182627 DEBUG nova.compute.manager [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:17:49 np0005592767 kernel: tap421fa598-6e (unregistering): left promiscuous mode
Jan 22 17:17:49 np0005592767 NetworkManager[54973]: <info>  [1769120269.6565] device (tap421fa598-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:17:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:49Z|00060|binding|INFO|Releasing lport 421fa598-6e14-4c3a-825c-abd482768676 from this chassis (sb_readonly=0)
Jan 22 17:17:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:49Z|00061|binding|INFO|Setting lport 421fa598-6e14-4c3a-825c-abd482768676 down in Southbound
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.664 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:49Z|00062|binding|INFO|Removing iface tap421fa598-6e ovn-installed in OVS
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.666 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.681 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.683 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:ae:b5 10.100.0.6'], port_security=['fa:16:3e:e7:ae:b5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '888fa71f-f52d-41c5-8814-4e0b8670b601', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31d2ab9-cf9f-454b-9df5-065776293667', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce818300105f44b6abd8aa2b62699bda', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3d5f0cd7-7223-4821-8ca2-c6a12fc0cebe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.247'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be6ad440-24bb-413c-a01d-43bdddaaa797, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=421fa598-6e14-4c3a-825c-abd482768676) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.685 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 421fa598-6e14-4c3a-825c-abd482768676 in datapath a31d2ab9-cf9f-454b-9df5-065776293667 unbound from our chassis#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.687 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a31d2ab9-cf9f-454b-9df5-065776293667, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.689 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9c7777-d431-428f-ad2c-cfc13d45ee09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.690 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 namespace which is not needed anymore#033[00m
Jan 22 17:17:49 np0005592767 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 22 17:17:49 np0005592767 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000b.scope: Consumed 13.459s CPU time.
Jan 22 17:17:49 np0005592767 systemd-machined[153912]: Machine qemu-5-instance-0000000b terminated.
Jan 22 17:17:49 np0005592767 podman[213085]: 2026-01-22 22:17:49.751836474 +0000 UTC m=+0.068278476 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 17:17:49 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [NOTICE]   (212956) : haproxy version is 2.8.14-c23fe91
Jan 22 17:17:49 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [NOTICE]   (212956) : path to executable is /usr/sbin/haproxy
Jan 22 17:17:49 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [WARNING]  (212956) : Exiting Master process...
Jan 22 17:17:49 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [ALERT]    (212956) : Current worker (212964) exited with code 143 (Terminated)
Jan 22 17:17:49 np0005592767 neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667[212946]: [WARNING]  (212956) : All workers exited. Exiting... (0)
Jan 22 17:17:49 np0005592767 systemd[1]: libpod-38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d.scope: Deactivated successfully.
Jan 22 17:17:49 np0005592767 conmon[212946]: conmon 38f2822b957d8bf092fd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d.scope/container/memory.events
Jan 22 17:17:49 np0005592767 podman[213129]: 2026-01-22 22:17:49.825881346 +0000 UTC m=+0.045286151 container died 38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:17:49 np0005592767 kernel: tap421fa598-6e: entered promiscuous mode
Jan 22 17:17:49 np0005592767 kernel: tap421fa598-6e (unregistering): left promiscuous mode
Jan 22 17:17:49 np0005592767 NetworkManager[54973]: <info>  [1769120269.8367] manager: (tap421fa598-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.844 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:17:49 np0005592767 systemd[1]: var-lib-containers-storage-overlay-c24e27fc7a191b71589a2d48aa5ab095705fc6e4126c44cce889b02127eaff79-merged.mount: Deactivated successfully.
Jan 22 17:17:49 np0005592767 podman[213129]: 2026-01-22 22:17:49.871375572 +0000 UTC m=+0.090780367 container cleanup 38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:17:49 np0005592767 systemd[1]: libpod-conmon-38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d.scope: Deactivated successfully.
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.897 182627 INFO nova.virt.libvirt.driver [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Instance destroyed successfully.#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.897 182627 DEBUG nova.objects.instance [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lazy-loading 'resources' on Instance uuid 888fa71f-f52d-41c5-8814-4e0b8670b601 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.919 182627 DEBUG nova.virt.libvirt.vif [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:17:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1490710752',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1490710752',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1490710752',id=11,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELnKbntEh9G5MZ12eWzXWY70wmzm3VzTvPu77lxVBnBlu+iJcGSuBYidHHZBA8p9HhdKhj62AfoRNg23cQd+DsuFFmxgPxFDa0kr+Edd2cHMgT/i1pJKJXbktLYbsAIGQ==',key_name='tempest-keypair-609803177',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:17:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce818300105f44b6abd8aa2b62699bda',ramdisk_id='',reservation_id='r-ux0s75us',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-469507991',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-469507991-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:17:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a84b864eafb4f74a43b72cf303742cc',uuid=888fa71f-f52d-41c5-8814-4e0b8670b601,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.919 182627 DEBUG nova.network.os_vif_util [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converting VIF {"id": "421fa598-6e14-4c3a-825c-abd482768676", "address": "fa:16:3e:e7:ae:b5", "network": {"id": "a31d2ab9-cf9f-454b-9df5-065776293667", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-456294171-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce818300105f44b6abd8aa2b62699bda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap421fa598-6e", "ovs_interfaceid": "421fa598-6e14-4c3a-825c-abd482768676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.920 182627 DEBUG nova.network.os_vif_util [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.921 182627 DEBUG os_vif [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.924 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap421fa598-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.925 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.929 182627 INFO os_vif [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:ae:b5,bridge_name='br-int',has_traffic_filtering=True,id=421fa598-6e14-4c3a-825c-abd482768676,network=Network(a31d2ab9-cf9f-454b-9df5-065776293667),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap421fa598-6e')#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.929 182627 INFO nova.virt.libvirt.driver [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Deleting instance files /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601_del#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.930 182627 INFO nova.virt.libvirt.driver [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Deletion of /var/lib/nova/instances/888fa71f-f52d-41c5-8814-4e0b8670b601_del complete#033[00m
Jan 22 17:17:49 np0005592767 podman[213174]: 2026-01-22 22:17:49.945103575 +0000 UTC m=+0.044377484 container remove 38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.951 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[954608a0-bf09-494b-a58c-d2ef3b5a5634]: (4, ('Thu Jan 22 10:17:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 (38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d)\n38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d\nThu Jan 22 10:17:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 (38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d)\n38f2822b957d8bf092fd71c1f202da24936448801045ee39515d085f41e5684d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.953 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d445cf3c-1f06-4951-a47a-5080fc8eba21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.954 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa31d2ab9-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.956 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 kernel: tapa31d2ab9-c0: left promiscuous mode
Jan 22 17:17:49 np0005592767 nova_compute[182623]: 2026-01-22 22:17:49.967 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.970 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[df63de08-7539-4eed-8ed4-8aad1a0bcd0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.986 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7b9a41-a0a0-44b4-8deb-4038bd17c22b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:49.988 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a435d2d2-ccc5-4f57-a913-7305b06edc30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:50.001 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e4155062-b88d-4e22-a38c-739befaff86f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384604, 'reachable_time': 20655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213190, 'error': None, 'target': 'ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:50.004 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a31d2ab9-cf9f-454b-9df5-065776293667 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:17:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:50.004 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d2104306-ac32-4a63-a750-8524875a0f6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:50 np0005592767 systemd[1]: run-netns-ovnmeta\x2da31d2ab9\x2dcf9f\x2d454b\x2d9df5\x2d065776293667.mount: Deactivated successfully.
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.065 182627 INFO nova.compute.manager [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.066 182627 DEBUG oslo.service.loopingcall [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.066 182627 DEBUG nova.compute.manager [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.066 182627 DEBUG nova.network.neutron [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.235 182627 DEBUG nova.compute.manager [req-21e4a058-630f-412e-a495-714ccf482fa9 req-731db8bf-a49f-464c-86c9-cc112e638688 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-vif-unplugged-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.236 182627 DEBUG oslo_concurrency.lockutils [req-21e4a058-630f-412e-a495-714ccf482fa9 req-731db8bf-a49f-464c-86c9-cc112e638688 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.236 182627 DEBUG oslo_concurrency.lockutils [req-21e4a058-630f-412e-a495-714ccf482fa9 req-731db8bf-a49f-464c-86c9-cc112e638688 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.236 182627 DEBUG oslo_concurrency.lockutils [req-21e4a058-630f-412e-a495-714ccf482fa9 req-731db8bf-a49f-464c-86c9-cc112e638688 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.236 182627 DEBUG nova.compute.manager [req-21e4a058-630f-412e-a495-714ccf482fa9 req-731db8bf-a49f-464c-86c9-cc112e638688 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] No waiting events found dispatching network-vif-unplugged-421fa598-6e14-4c3a-825c-abd482768676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:50 np0005592767 nova_compute[182623]: 2026-01-22 22:17:50.237 182627 DEBUG nova.compute.manager [req-21e4a058-630f-412e-a495-714ccf482fa9 req-731db8bf-a49f-464c-86c9-cc112e638688 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-vif-unplugged-421fa598-6e14-4c3a-825c-abd482768676 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.092 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.092 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.093 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.093 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.093 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.104 182627 INFO nova.compute.manager [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Terminating instance#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.114 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "refresh_cache-0fb222ec-1e7d-4b8d-821b-aded7d7f5678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.115 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquired lock "refresh_cache-0fb222ec-1e7d-4b8d-821b-aded7d7f5678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.115 182627 DEBUG nova.network.neutron [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.393 182627 DEBUG nova.network.neutron [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.532 182627 DEBUG nova.network.neutron [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating instance_info_cache with network_info: [{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.553 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.566 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb864a01-1633-42f3-ac5f-4d664cc5d477',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.566 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Creating instance directory: /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.567 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Creating disk.info with the contents: {'/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk': 'qcow2', '/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.567 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.568 182627 DEBUG nova.objects.instance [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb864a01-1633-42f3-ac5f-4d664cc5d477 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.597 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.651 182627 DEBUG nova.network.neutron [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.671 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Releasing lock "refresh_cache-0fb222ec-1e7d-4b8d-821b-aded7d7f5678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.672 182627 DEBUG nova.compute.manager [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.675 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.677 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.678 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.690 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.711 182627 INFO nova.virt.libvirt.driver [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Instance destroyed successfully.#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.712 182627 DEBUG nova.objects.instance [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lazy-loading 'resources' on Instance uuid 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.730 182627 INFO nova.virt.libvirt.driver [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Deleting instance files /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678_del#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.730 182627 INFO nova.virt.libvirt.driver [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Deletion of /var/lib/nova/instances/0fb222ec-1e7d-4b8d-821b-aded7d7f5678_del complete#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.749 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.749 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.782 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.783 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.784 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.833 182627 INFO nova.compute.manager [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Took 0.16 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.834 182627 DEBUG oslo.service.loopingcall [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.835 182627 DEBUG nova.compute.manager [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.836 182627 DEBUG nova.network.neutron [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.841 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.842 182627 DEBUG nova.virt.disk.api [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Checking if we can resize image /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.843 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.892 182627 DEBUG nova.network.neutron [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.909 182627 INFO nova.compute.manager [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Took 1.84 seconds to deallocate network for instance.#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.923 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.924 182627 DEBUG nova.virt.disk.api [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Cannot resize image /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.925 182627 DEBUG nova.objects.instance [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lazy-loading 'migration_context' on Instance uuid eb864a01-1633-42f3-ac5f-4d664cc5d477 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.964 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.985 182627 DEBUG nova.network.neutron [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.988 182627 DEBUG nova.compute.manager [req-bf22f1a1-b636-4328-b5a9-8a35aeaceac4 req-982aa29a-af3b-4c22-b71d-ae6b4af1c11a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-vif-deleted-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.991 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.991 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config to /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:17:51 np0005592767 nova_compute[182623]: 2026-01-22 22:17:51.992 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.010 182627 DEBUG nova.network.neutron [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.013 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.013 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.028 182627 INFO nova.compute.manager [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Took 0.19 seconds to deallocate network for instance.#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.095 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.107 182627 DEBUG nova.compute.provider_tree [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.122 182627 DEBUG nova.scheduler.client.report [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.141 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.143 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:52 np0005592767 podman[213212]: 2026-01-22 22:17:52.144237197 +0000 UTC m=+0.053477318 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:17:52 np0005592767 podman[213211]: 2026-01-22 22:17:52.180161156 +0000 UTC m=+0.091320143 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.185 182627 INFO nova.scheduler.client.report [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Deleted allocations for instance 888fa71f-f52d-41c5-8814-4e0b8670b601#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.223 182627 DEBUG nova.compute.provider_tree [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.242 182627 DEBUG nova.scheduler.client.report [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.267 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.281 182627 DEBUG oslo_concurrency.lockutils [None req-b9775a66-2132-481c-b6cb-c9237a4710a7 5a84b864eafb4f74a43b72cf303742cc ce818300105f44b6abd8aa2b62699bda - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.323 182627 INFO nova.scheduler.client.report [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Deleted allocations for instance 0fb222ec-1e7d-4b8d-821b-aded7d7f5678#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.407 182627 DEBUG oslo_concurrency.lockutils [None req-adf248a5-7de7-4462-92e0-9af228412f56 3720ae93dc774d46a0b3cc6c5ba2421b fa9926c22bea49b3aca22e946dca07db - - default default] Lock "0fb222ec-1e7d-4b8d-821b-aded7d7f5678" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.432 182627 DEBUG nova.compute.manager [req-f97457c6-bc90-42c0-a15b-50dee2d89d78 req-3b91e938-d902-4f08-98d8-8835c8acd79f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.432 182627 DEBUG oslo_concurrency.lockutils [req-f97457c6-bc90-42c0-a15b-50dee2d89d78 req-3b91e938-d902-4f08-98d8-8835c8acd79f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.433 182627 DEBUG oslo_concurrency.lockutils [req-f97457c6-bc90-42c0-a15b-50dee2d89d78 req-3b91e938-d902-4f08-98d8-8835c8acd79f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.433 182627 DEBUG oslo_concurrency.lockutils [req-f97457c6-bc90-42c0-a15b-50dee2d89d78 req-3b91e938-d902-4f08-98d8-8835c8acd79f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "888fa71f-f52d-41c5-8814-4e0b8670b601-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.433 182627 DEBUG nova.compute.manager [req-f97457c6-bc90-42c0-a15b-50dee2d89d78 req-3b91e938-d902-4f08-98d8-8835c8acd79f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] No waiting events found dispatching network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.433 182627 WARNING nova.compute.manager [req-f97457c6-bc90-42c0-a15b-50dee2d89d78 req-3b91e938-d902-4f08-98d8-8835c8acd79f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Received unexpected event network-vif-plugged-421fa598-6e14-4c3a-825c-abd482768676 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.528 182627 DEBUG oslo_concurrency.processutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477/disk.config /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.529 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.530 182627 DEBUG nova.virt.libvirt.vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:17:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1892112726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1892112726',id=14,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:17:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-m40501p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:17:38Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.531 182627 DEBUG nova.network.os_vif_util [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converting VIF {"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.532 182627 DEBUG nova.network.os_vif_util [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.532 182627 DEBUG os_vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.533 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.534 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.534 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.536 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.536 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d0bf445-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.537 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1d0bf445-f7, col_values=(('external_ids', {'iface-id': '1d0bf445-f745-430d-9927-a3d8cdc9b6fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:be:3d', 'vm-uuid': 'eb864a01-1633-42f3-ac5f-4d664cc5d477'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.538 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:52 np0005592767 NetworkManager[54973]: <info>  [1769120272.5390] manager: (tap1d0bf445-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.541 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.543 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.544 182627 INFO os_vif [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7')#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.544 182627 DEBUG nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 22 17:17:52 np0005592767 nova_compute[182623]: 2026-01-22 22:17:52.545 182627 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb864a01-1633-42f3-ac5f-4d664cc5d477',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 22 17:17:54 np0005592767 nova_compute[182623]: 2026-01-22 22:17:54.519 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:54 np0005592767 nova_compute[182623]: 2026-01-22 22:17:54.673 182627 DEBUG nova.network.neutron [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 22 17:17:54 np0005592767 nova_compute[182623]: 2026-01-22 22:17:54.686 182627 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphpugygql',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb864a01-1633-42f3-ac5f-4d664cc5d477',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 22 17:17:54 np0005592767 kernel: tap1d0bf445-f7: entered promiscuous mode
Jan 22 17:17:54 np0005592767 NetworkManager[54973]: <info>  [1769120274.9669] manager: (tap1d0bf445-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Jan 22 17:17:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:54Z|00063|binding|INFO|Claiming lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc for this additional chassis.
Jan 22 17:17:54 np0005592767 nova_compute[182623]: 2026-01-22 22:17:54.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:54Z|00064|binding|INFO|1d0bf445-f745-430d-9927-a3d8cdc9b6fc: Claiming fa:16:3e:83:be:3d 10.100.0.6
Jan 22 17:17:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:54Z|00065|binding|INFO|Claiming lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d for this additional chassis.
Jan 22 17:17:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:54Z|00066|binding|INFO|9927fe61-75e1-4c06-8f4c-ccc8597a433d: Claiming fa:16:3e:1b:78:60 19.80.0.76
Jan 22 17:17:54 np0005592767 systemd-udevd[213274]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:55 np0005592767 NetworkManager[54973]: <info>  [1769120275.0065] device (tap1d0bf445-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:17:55 np0005592767 NetworkManager[54973]: <info>  [1769120275.0070] device (tap1d0bf445-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.014 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:55Z|00067|binding|INFO|Setting lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc ovn-installed in OVS
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.017 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:55 np0005592767 systemd-machined[153912]: New machine qemu-7-instance-0000000e.
Jan 22 17:17:55 np0005592767 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.865 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120275.8643916, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.865 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Started (Lifecycle Event)#033[00m
Jan 22 17:17:55 np0005592767 nova_compute[182623]: 2026-01-22 22:17:55.885 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:56 np0005592767 nova_compute[182623]: 2026-01-22 22:17:56.873 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120276.8734298, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:56 np0005592767 nova_compute[182623]: 2026-01-22 22:17:56.874 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:17:56 np0005592767 nova_compute[182623]: 2026-01-22 22:17:56.895 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:56 np0005592767 nova_compute[182623]: 2026-01-22 22:17:56.899 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:17:56 np0005592767 nova_compute[182623]: 2026-01-22 22:17:56.920 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 22 17:17:57 np0005592767 nova_compute[182623]: 2026-01-22 22:17:57.539 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:58 np0005592767 podman[213305]: 2026-01-22 22:17:58.15839336 +0000 UTC m=+0.062685694 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:17:58 np0005592767 nova_compute[182623]: 2026-01-22 22:17:58.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:58 np0005592767 nova_compute[182623]: 2026-01-22 22:17:58.563 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:58Z|00068|binding|INFO|Claiming lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc for this chassis.
Jan 22 17:17:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:58Z|00069|binding|INFO|1d0bf445-f745-430d-9927-a3d8cdc9b6fc: Claiming fa:16:3e:83:be:3d 10.100.0.6
Jan 22 17:17:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:58Z|00070|binding|INFO|Claiming lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d for this chassis.
Jan 22 17:17:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:58Z|00071|binding|INFO|9927fe61-75e1-4c06-8f4c-ccc8597a433d: Claiming fa:16:3e:1b:78:60 19.80.0.76
Jan 22 17:17:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:58Z|00072|binding|INFO|Setting lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc up in Southbound
Jan 22 17:17:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:58Z|00073|binding|INFO|Setting lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d up in Southbound
Jan 22 17:17:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:58.977 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:78:60 19.80.0.76'], port_security=['fa:16:3e:1b:78:60 19.80.0.76'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['1d0bf445-f745-430d-9927-a3d8cdc9b6fc'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2040814554', 'neutron:cidrs': '19.80.0.76/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2040814554', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=52c62c6f-61b3-4b60-8745-b12d4e251f43, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9927fe61-75e1-4c06-8f4c-ccc8597a433d) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:58.981 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:be:3d 10.100.0.6'], port_security=['fa:16:3e:83:be:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-835502342', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eb864a01-1633-42f3-ac5f-4d664cc5d477', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-835502342', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=1d0bf445-f745-430d-9927-a3d8cdc9b6fc) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:17:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:58.983 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 9927fe61-75e1-4c06-8f4c-ccc8597a433d in datapath 1b7cb047-7415-4b9a-be62-075d33a42dfe bound to our chassis#033[00m
Jan 22 17:17:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:58.986 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b7cb047-7415-4b9a-be62-075d33a42dfe#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.004 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120264.0031383, 0fb222ec-1e7d-4b8d-821b-aded7d7f5678 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.004 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3addb0fb-772a-4bb0-a0c8-8ed5c3472650]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.005 182627 INFO nova.compute.manager [-] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.005 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b7cb047-71 in ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.008 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b7cb047-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.008 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f28cd455-1538-4b40-8500-62720349cff4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.009 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1c5fc1a1-b7e8-4093-b959-22a48df93a95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.021 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[764a0e36-34ce-47ba-80cc-7eec0764c2c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.024 182627 DEBUG nova.compute.manager [None req-f4ee786a-16a9-46b8-9b20-e7e08d2504d4 - - - - - -] [instance: 0fb222ec-1e7d-4b8d-821b-aded7d7f5678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.036 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[76c4b762-3367-4589-aa85-3d0e9087008f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.066 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[31e7e066-2e60-48fd-ab61-54caa1980602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.071 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b953c4a0-b94e-4a1a-82ed-7c58e8dfb92d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 NetworkManager[54973]: <info>  [1769120279.0737] manager: (tap1b7cb047-70): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Jan 22 17:17:59 np0005592767 systemd-udevd[213332]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.101 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[14a8e3e9-0a3a-46e9-b275-513cbbf29b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.104 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[79153bf3-c94a-4762-aac0-e33107bb0f98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 NetworkManager[54973]: <info>  [1769120279.1271] device (tap1b7cb047-70): carrier: link connected
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.131 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[47b9daeb-a7ec-40a7-9e92-427e9c86876d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.148 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a81f776e-456b-4de4-91cd-d8d3253921ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b7cb047-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:53:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387570, 'reachable_time': 25941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213351, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.162 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2ad48cb6-479b-4d10-ba71-f6dcafc1a973]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:534f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387570, 'tstamp': 387570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213352, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.177 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[711bbafe-ed4e-49bf-bbfd-1ccf5ff67563]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b7cb047-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:53:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387570, 'reachable_time': 25941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213353, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.206 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[591dcf9c-2d47-46e4-b154-1327204f8055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.227 182627 INFO nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Post operation of migration started#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.272 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d266cf92-e71d-4697-bcb7-9056425c128c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.273 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b7cb047-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.273 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.273 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b7cb047-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:59 np0005592767 kernel: tap1b7cb047-70: entered promiscuous mode
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.275 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:59 np0005592767 NetworkManager[54973]: <info>  [1769120279.2758] manager: (tap1b7cb047-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.277 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.277 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b7cb047-70, col_values=(('external_ids', {'iface-id': 'd3f99e89-12b2-4c5f-a047-a3d3247ffb04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.278 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:17:59Z|00074|binding|INFO|Releasing lport d3f99e89-12b2-4c5f-a047-a3d3247ffb04 from this chassis (sb_readonly=0)
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.291 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.292 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b7cb047-7415-4b9a-be62-075d33a42dfe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b7cb047-7415-4b9a-be62-075d33a42dfe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.292 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[851ecad2-de60-4e74-9b05-a73e55b8bd1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.293 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-1b7cb047-7415-4b9a-be62-075d33a42dfe
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/1b7cb047-7415-4b9a-be62-075d33a42dfe.pid.haproxy
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 1b7cb047-7415-4b9a-be62-075d33a42dfe
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.293 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'env', 'PROCESS_TAG=haproxy-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b7cb047-7415-4b9a-be62-075d33a42dfe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.521 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.664 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "84d2041e-03a2-4fac-b088-240a1b0badef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:59 np0005592767 podman[213385]: 2026-01-22 22:17:59.665255464 +0000 UTC m=+0.050614705 container create f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.665 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "84d2041e-03a2-4fac-b088-240a1b0badef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.683 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:17:59 np0005592767 systemd[1]: Started libpod-conmon-f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e.scope.
Jan 22 17:17:59 np0005592767 podman[213385]: 2026-01-22 22:17:59.640528149 +0000 UTC m=+0.025887410 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:17:59 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:17:59 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b89b67d73593b4bb03509032e6bc357ad498c5aa634d2740df8458bc6f1b2b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:17:59 np0005592767 podman[213385]: 2026-01-22 22:17:59.75429133 +0000 UTC m=+0.139650581 container init f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:17:59 np0005592767 podman[213385]: 2026-01-22 22:17:59.761745186 +0000 UTC m=+0.147104427 container start f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:17:59 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [NOTICE]   (213404) : New worker (213406) forked
Jan 22 17:17:59 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [NOTICE]   (213404) : Loading success.
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.801 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.802 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.808 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.808 182627 INFO nova.compute.claims [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.825 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.826 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0265f228-4e11-4f15-8d77-6acb409f3f7b#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.839 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3928b765-0cba-44ec-8291-73449989544e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.840 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0265f228-41 in ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.842 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0265f228-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.842 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bec7a5ea-6242-4408-9533-151a3acbf715]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.843 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[db3fcf06-fda4-4b39-905f-ad7b62ef775d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.854 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.854 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquired lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:17:59 np0005592767 nova_compute[182623]: 2026-01-22 22:17:59.854 182627 DEBUG nova.network.neutron [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.857 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[2145a9d8-13eb-463a-9452-88f8e7d3f69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.884 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0af5da69-fd95-4751-84a5-573d9949b5e2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.918 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b682c7-8b4d-4b03-8d24-44bdefdc7420]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 NetworkManager[54973]: <info>  [1769120279.9258] manager: (tap0265f228-40): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.925 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5e19165a-5d71-4043-91b0-5c399b6ad969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 systemd-udevd[213341]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.953 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a906481d-8b29-45ef-bfa2-cc6ce3f335fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.956 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb43100-fae3-43a2-ae7b-57bf4d32e495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:17:59 np0005592767 NetworkManager[54973]: <info>  [1769120279.9773] device (tap0265f228-40): carrier: link connected
Jan 22 17:17:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:17:59.982 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb13987-06c1-4568-be2d-e3c71e257634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.000 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6960db-fa95-4588-a0a2-eee0964a59ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387655, 'reachable_time': 36877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213425, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.018 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b4834fae-d814-4741-9499-d194ec95adb2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:8003'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387655, 'tstamp': 387655}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213426, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.020 182627 DEBUG nova.compute.provider_tree [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.033 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c26ef8-e225-4673-8098-2aa7a903623f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0265f228-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:80:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387655, 'reachable_time': 36877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213427, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.035 182627 DEBUG nova.scheduler.client.report [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.055 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.056 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.074 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[866b70b6-e857-4aa9-9eea-60a43ca01f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.105 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.105 182627 DEBUG nova.network.neutron [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.124 182627 INFO nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.130 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5cbf714b-337e-478a-a58b-49fbf1aa8f68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.132 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.132 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.132 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0265f228-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.134 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:00 np0005592767 kernel: tap0265f228-40: entered promiscuous mode
Jan 22 17:18:00 np0005592767 NetworkManager[54973]: <info>  [1769120280.1353] manager: (tap0265f228-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.137 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.139 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0265f228-40, col_values=(('external_ids', {'iface-id': '7a6b2843-0304-440c-ac2a-e8d7f0e704c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.140 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:00 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:00Z|00075|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.141 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.141 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.143 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d91204a8-e171-4017-89ac-adc4ef9321e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.144 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/0265f228-4e11-4f15-8d77-6acb409f3f7b.pid.haproxy
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 0265f228-4e11-4f15-8d77-6acb409f3f7b
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:18:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:00.144 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'env', 'PROCESS_TAG=haproxy-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0265f228-4e11-4f15-8d77-6acb409f3f7b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
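The command list logged above shows the structure of the agent's haproxy launch: a rootwrap-escalated `ip netns exec` into the per-network `ovnmeta-<network-id>` namespace, a `PROCESS_TAG` environment variable, then `haproxy -f <config>`. As a rough sketch (not the actual neutron code; `build_haproxy_cmd` and its defaults are hypothetical), that assembly could look like:

```python
# Hypothetical reconstruction of the logged haproxy launch command;
# the real logic lives in neutron.agent.ovn.metadata.driver.
def build_haproxy_cmd(network_id: str,
                      cfg_dir: str = "/var/lib/neutron/ovn-metadata-proxy") -> list:
    """Return the rootwrap-wrapped haproxy invocation for one network."""
    netns = f"ovnmeta-{network_id}"  # per-network metadata namespace
    return [
        "sudo", "neutron-rootwrap", "/etc/neutron/rootwrap.conf",
        "ip", "netns", "exec", netns,
        "env", f"PROCESS_TAG=haproxy-{network_id}",
        "haproxy", "-f", f"{cfg_dir}/{network_id}.conf",
    ]

cmd = build_haproxy_cmd("0265f228-4e11-4f15-8d77-6acb409f3f7b")
print(" ".join(cmd))
```

The `PROCESS_TAG` lets the agent later find and signal this specific haproxy among many per-network proxies.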
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.148 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.264 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.265 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.266 182627 INFO nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Creating image(s)
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.266 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.267 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.267 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
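The acquire/acquired/released triple above is oslo.concurrency's standard named-lock trace, reporting both the time waited and the time held. A minimal local analog (not oslo.concurrency itself; `named_lock` and `_locks` are hypothetical) of that pattern:

```python
import threading
import time
from contextlib import contextmanager

# Rough analog of oslo_concurrency.lockutils' named-lock logging:
# one process-wide lock per name, with wait/held durations reported.
_locks = {}

@contextmanager
def named_lock(name: str):
    lock = _locks.setdefault(name, threading.Lock())  # atomic in CPython
    t0 = time.monotonic()
    lock.acquire()
    acquired = time.monotonic()
    print(f'Lock "{name}" acquired :: waited {acquired - t0:.3f}s')
    try:
        yield
    finally:
        lock.release()
        print(f'Lock "{name}" "released" :: held {time.monotonic() - acquired:.3f}s')

with named_lock("disk.info"):
    pass  # critical section, e.g. rewriting the disk.info file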
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.284 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.403 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.404 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.409 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.420 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.489 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.490 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.541 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.543 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
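The sequence above creates the instance's root disk as a 1 GiB qcow2 overlay on top of the cached base image, then re-probes it with `qemu-img info --output=json`. A small sketch of consuming that JSON (the field names match qemu-img's JSON schema; the sample values here are made up to mirror the log, not captured output):

```python
import json

# Hypothetical sample of `qemu-img info --output=json` for the overlay disk;
# values are illustrative, field names follow qemu-img's JSON output.
sample = """
{
  "virtual-size": 1073741824,
  "filename": "/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk",
  "format": "qcow2",
  "backing-filename": "/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e"
}
"""

info = json.loads(sample)
print(f"{info['filename']}: {info['virtual-size'] / 2**30:.0f} GiB "
      f"{info['format']}, backed by {info['backing-filename']}")
```

Nova uses exactly this parsed view to decide whether the requested flavor size requires a resize (see the "Cannot resize image ... to a smaller size" check a few lines below).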
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.544 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:00 np0005592767 podman[213463]: 2026-01-22 22:18:00.554161911 +0000 UTC m=+0.063415375 container create 1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 17:18:00 np0005592767 systemd[1]: Started libpod-conmon-1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f.scope.
Jan 22 17:18:00 np0005592767 podman[213463]: 2026-01-22 22:18:00.520910459 +0000 UTC m=+0.030163953 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.613 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.614 182627 DEBUG nova.virt.disk.api [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Checking if we can resize image /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.615 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:00 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:18:00 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e89aa6f5b7df1e0331cdfa044f2ed96cd6db5a84f1b0ce1cf349bd0f80e591/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:18:00 np0005592767 podman[213463]: 2026-01-22 22:18:00.637859513 +0000 UTC m=+0.147112997 container init 1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.642 182627 DEBUG nova.network.neutron [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.642 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:18:00 np0005592767 podman[213463]: 2026-01-22 22:18:00.643216818 +0000 UTC m=+0.152470272 container start 1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:18:00 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [NOTICE]   (213501) : New worker (213505) forked
Jan 22 17:18:00 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [NOTICE]   (213501) : Loading success.
Jan 22 17:18:00 np0005592767 podman[213485]: 2026-01-22 22:18:00.700864856 +0000 UTC m=+0.084581569 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.710 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.711 182627 DEBUG nova.virt.disk.api [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Cannot resize image /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.711 182627 DEBUG nova.objects.instance [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 84d2041e-03a2-4fac-b088-240a1b0badef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.726 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.726 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Ensure instance console log exists: /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.727 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.727 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.727 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.729 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.734 182627 WARNING nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.738 182627 DEBUG nova.virt.libvirt.host [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.739 182627 DEBUG nova.virt.libvirt.host [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.743 182627 DEBUG nova.virt.libvirt.host [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.743 182627 DEBUG nova.virt.libvirt.host [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.744 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.745 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.745 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.745 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.745 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.745 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.746 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.746 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.746 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.746 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:469
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.746 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.746 182627 DEBUG nova.virt.hardware [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.750 182627 DEBUG nova.objects.instance [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 84d2041e-03a2-4fac-b088-240a1b0badef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.760 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <uuid>84d2041e-03a2-4fac-b088-240a1b0badef</uuid>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <name>instance-00000010</name>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-650190103</nova:name>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:18:00</nova:creationTime>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:user uuid="b221c6d745ca427fa43417e878aeacbf">tempest-ServerDiagnosticsNegativeTest-2092721669-project-member</nova:user>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:        <nova:project uuid="5e17e6c264454090b8b05b2a005921d8">tempest-ServerDiagnosticsNegativeTest-2092721669</nova:project>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <entry name="serial">84d2041e-03a2-4fac-b088-240a1b0badef</entry>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <entry name="uuid">84d2041e-03a2-4fac-b088-240a1b0badef</entry>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.config"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/console.log" append="off"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:18:00 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:18:00 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:18:00 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:18:00 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.813 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.814 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:18:00 np0005592767 nova_compute[182623]: 2026-01-22 22:18:00.814 182627 INFO nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Using config drive
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.261 182627 INFO nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Creating config drive at /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.config
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.265 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa8akvg8u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.389 182627 DEBUG oslo_concurrency.processutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa8akvg8u" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:01 np0005592767 systemd-machined[153912]: New machine qemu-8-instance-00000010.
Jan 22 17:18:01 np0005592767 systemd[1]: Started Virtual Machine qemu-8-instance-00000010.
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.661 182627 DEBUG nova.network.neutron [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating instance_info_cache with network_info: [{"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.676 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Releasing lock "refresh_cache-eb864a01-1633-42f3-ac5f-4d664cc5d477" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.703 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.704 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.704 182627 DEBUG oslo_concurrency.lockutils [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:01 np0005592767 nova_compute[182623]: 2026-01-22 22:18:01.712 182627 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 22 17:18:01 np0005592767 virtqemud[182095]: Domain id=7 name='instance-0000000e' uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477 is tainted: custom-monitor
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.005 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120282.0056007, 84d2041e-03a2-4fac-b088-240a1b0badef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.006 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] VM Resumed (Lifecycle Event)
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.008 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.008 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.011 182627 INFO nova.virt.libvirt.driver [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Instance spawned successfully.
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.011 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.029 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.031 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.031 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.031 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.032 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.032 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.033 182627 DEBUG nova.virt.libvirt.driver [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.037 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.068 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.068 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120282.0086212, 84d2041e-03a2-4fac-b088-240a1b0badef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.069 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] VM Started (Lifecycle Event)
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.085 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.088 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.111 182627 INFO nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Took 1.85 seconds to spawn the instance on the hypervisor.
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.111 182627 DEBUG nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.113 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.262 182627 INFO nova.compute.manager [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Took 2.52 seconds to build instance.
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.278 182627 DEBUG oslo_concurrency.lockutils [None req-fef090d2-9d7b-4b8c-8a4b-3d0288d1eb00 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "84d2041e-03a2-4fac-b088-240a1b0badef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.541 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:02 np0005592767 nova_compute[182623]: 2026-01-22 22:18:02.721 182627 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.729 182627 INFO nova.virt.libvirt.driver [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.733 182627 DEBUG nova.compute.manager [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.768 182627 DEBUG nova.objects.instance [None req-111dfae6-5483-4100-b12d-b4f57ac5ec85 f504c4b3e01547a98bde37c90625e9cd 94534127dbe741c9a9ce39862ff15bc8 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.977 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "84d2041e-03a2-4fac-b088-240a1b0badef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.978 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "84d2041e-03a2-4fac-b088-240a1b0badef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.978 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "84d2041e-03a2-4fac-b088-240a1b0badef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.979 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "84d2041e-03a2-4fac-b088-240a1b0badef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.979 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "84d2041e-03a2-4fac-b088-240a1b0badef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:03 np0005592767 nova_compute[182623]: 2026-01-22 22:18:03.990 182627 INFO nova.compute.manager [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Terminating instance
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.000 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "refresh_cache-84d2041e-03a2-4fac-b088-240a1b0badef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.000 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquired lock "refresh_cache-84d2041e-03a2-4fac-b088-240a1b0badef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.000 182627 DEBUG nova.network.neutron [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.220 182627 DEBUG nova.network.neutron [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.517 182627 DEBUG nova.network.neutron [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.525 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.542 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Releasing lock "refresh_cache-84d2041e-03a2-4fac-b088-240a1b0badef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.543 182627 DEBUG nova.compute.manager [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:18:04 np0005592767 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 22 17:18:04 np0005592767 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000010.scope: Consumed 3.063s CPU time.
Jan 22 17:18:04 np0005592767 systemd-machined[153912]: Machine qemu-8-instance-00000010 terminated.
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.600 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.791 182627 INFO nova.virt.libvirt.driver [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Instance destroyed successfully.#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.792 182627 DEBUG nova.objects.instance [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lazy-loading 'resources' on Instance uuid 84d2041e-03a2-4fac-b088-240a1b0badef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.810 182627 INFO nova.virt.libvirt.driver [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Deleting instance files /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef_del#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.811 182627 INFO nova.virt.libvirt.driver [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Deletion of /var/lib/nova/instances/84d2041e-03a2-4fac-b088-240a1b0badef_del complete#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.884 182627 INFO nova.compute.manager [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.886 182627 DEBUG oslo.service.loopingcall [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.886 182627 DEBUG nova.compute.manager [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.886 182627 DEBUG nova.network.neutron [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.896 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120269.895517, 888fa71f-f52d-41c5-8814-4e0b8670b601 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.896 182627 INFO nova.compute.manager [-] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:04 np0005592767 nova_compute[182623]: 2026-01-22 22:18:04.920 182627 DEBUG nova.compute.manager [None req-34b22dba-337e-46d6-b3cf-0575fc7c4dcf - - - - - -] [instance: 888fa71f-f52d-41c5-8814-4e0b8670b601] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.153 182627 DEBUG nova.network.neutron [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.174 182627 DEBUG nova.network.neutron [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.198 182627 INFO nova.compute.manager [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Took 0.31 seconds to deallocate network for instance.#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.284 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.284 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.368 182627 DEBUG nova.compute.provider_tree [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.384 182627 DEBUG nova.scheduler.client.report [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.404 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.433 182627 INFO nova.scheduler.client.report [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Deleted allocations for instance 84d2041e-03a2-4fac-b088-240a1b0badef#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.529 182627 DEBUG oslo_concurrency.lockutils [None req-48a3795b-65f2-47f4-9ab9-0b261290b749 b221c6d745ca427fa43417e878aeacbf 5e17e6c264454090b8b05b2a005921d8 - - default default] Lock "84d2041e-03a2-4fac-b088-240a1b0badef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.811 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.812 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.812 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.812 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.812 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.825 182627 INFO nova.compute.manager [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Terminating instance#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.839 182627 DEBUG nova.compute.manager [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:18:05 np0005592767 kernel: tap1d0bf445-f7 (unregistering): left promiscuous mode
Jan 22 17:18:05 np0005592767 NetworkManager[54973]: <info>  [1769120285.8634] device (tap1d0bf445-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00076|binding|INFO|Releasing lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc from this chassis (sb_readonly=0)
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00077|binding|INFO|Setting lport 1d0bf445-f745-430d-9927-a3d8cdc9b6fc down in Southbound
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00078|binding|INFO|Releasing lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d from this chassis (sb_readonly=0)
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.867 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00079|binding|INFO|Setting lport 9927fe61-75e1-4c06-8f4c-ccc8597a433d down in Southbound
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00080|binding|INFO|Removing iface tap1d0bf445-f7 ovn-installed in OVS
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.871 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00081|binding|INFO|Releasing lport d3f99e89-12b2-4c5f-a047-a3d3247ffb04 from this chassis (sb_readonly=0)
Jan 22 17:18:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:05Z|00082|binding|INFO|Releasing lport 7a6b2843-0304-440c-ac2a-e8d7f0e704c0 from this chassis (sb_readonly=0)
Jan 22 17:18:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:05.874 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:78:60 19.80.0.76'], port_security=['fa:16:3e:1b:78:60 19.80.0.76'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['1d0bf445-f745-430d-9927-a3d8cdc9b6fc'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2040814554', 'neutron:cidrs': '19.80.0.76/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2040814554', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=52c62c6f-61b3-4b60-8745-b12d4e251f43, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9927fe61-75e1-4c06-8f4c-ccc8597a433d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:18:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:05.876 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:be:3d 10.100.0.6'], port_security=['fa:16:3e:83:be:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-835502342', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eb864a01-1633-42f3-ac5f-4d664cc5d477', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-835502342', 'neutron:project_id': '4ff5f7f17f1c471986dfd67f5192359f', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'e90ac107-524e-4322-b8d2-b17275d5934e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4788d2e5-8558-45c0-aad9-8b763d575591, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=1d0bf445-f745-430d-9927-a3d8cdc9b6fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:18:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:05.877 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 9927fe61-75e1-4c06-8f4c-ccc8597a433d in datapath 1b7cb047-7415-4b9a-be62-075d33a42dfe unbound from our chassis#033[00m
Jan 22 17:18:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:05.878 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b7cb047-7415-4b9a-be62-075d33a42dfe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:18:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:05.879 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c0d588-3c1c-4c07-9c62-aa7eb7774f25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:05.880 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe namespace which is not needed anymore#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.894 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:05 np0005592767 nova_compute[182623]: 2026-01-22 22:18:05.952 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:05 np0005592767 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 22 17:18:05 np0005592767 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 1.869s CPU time.
Jan 22 17:18:05 np0005592767 systemd-machined[153912]: Machine qemu-7-instance-0000000e terminated.
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [NOTICE]   (213404) : haproxy version is 2.8.14-c23fe91
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [NOTICE]   (213404) : path to executable is /usr/sbin/haproxy
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [WARNING]  (213404) : Exiting Master process...
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [WARNING]  (213404) : Exiting Master process...
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [ALERT]    (213404) : Current worker (213406) exited with code 143 (Terminated)
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe[213400]: [WARNING]  (213404) : All workers exited. Exiting... (0)
Jan 22 17:18:06 np0005592767 systemd[1]: libpod-f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e.scope: Deactivated successfully.
Jan 22 17:18:06 np0005592767 podman[213586]: 2026-01-22 22:18:06.024172701 +0000 UTC m=+0.042872212 container died f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:18:06 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e-userdata-shm.mount: Deactivated successfully.
Jan 22 17:18:06 np0005592767 systemd[1]: var-lib-containers-storage-overlay-2b89b67d73593b4bb03509032e6bc357ad498c5aa634d2740df8458bc6f1b2b0-merged.mount: Deactivated successfully.
Jan 22 17:18:06 np0005592767 podman[213586]: 2026-01-22 22:18:06.059525603 +0000 UTC m=+0.078225114 container cleanup f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:18:06 np0005592767 NetworkManager[54973]: <info>  [1769120286.0607] manager: (tap1d0bf445-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 22 17:18:06 np0005592767 systemd[1]: libpod-conmon-f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e.scope: Deactivated successfully.
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.103 182627 INFO nova.virt.libvirt.driver [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Instance destroyed successfully.#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.104 182627 DEBUG nova.objects.instance [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lazy-loading 'resources' on Instance uuid eb864a01-1633-42f3-ac5f-4d664cc5d477 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.119 182627 DEBUG nova.virt.libvirt.vif [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:17:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1892112726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1892112726',id=14,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:17:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4ff5f7f17f1c471986dfd67f5192359f',ramdisk_id='',reservation_id='r-m40501p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1833907945',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1833907945-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:18:03Z,user_data=None,user_id='f591d36af603475bbc613d6c93854a42',uuid=eb864a01-1633-42f3-ac5f-4d664cc5d477,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.119 182627 DEBUG nova.network.os_vif_util [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converting VIF {"id": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "address": "fa:16:3e:83:be:3d", "network": {"id": "0265f228-4e11-4f15-8d77-6acb409f3f7b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1563559322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4ff5f7f17f1c471986dfd67f5192359f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d0bf445-f7", "ovs_interfaceid": "1d0bf445-f745-430d-9927-a3d8cdc9b6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.120 182627 DEBUG nova.network.os_vif_util [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.120 182627 DEBUG os_vif [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.124 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d0bf445-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.125 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.127 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.130 182627 INFO os_vif [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:be:3d,bridge_name='br-int',has_traffic_filtering=True,id=1d0bf445-f745-430d-9927-a3d8cdc9b6fc,network=Network(0265f228-4e11-4f15-8d77-6acb409f3f7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1d0bf445-f7')#033[00m
Jan 22 17:18:06 np0005592767 podman[213625]: 2026-01-22 22:18:06.131870456 +0000 UTC m=+0.047166525 container remove f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.132 182627 INFO nova.virt.libvirt.driver [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Deleting instance files /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477_del#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.133 182627 INFO nova.virt.libvirt.driver [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Deletion of /var/lib/nova/instances/eb864a01-1633-42f3-ac5f-4d664cc5d477_del complete#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.137 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3671c721-c8ea-418c-b18c-498a6f57de53]: (4, ('Thu Jan 22 10:18:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe (f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e)\nf191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e\nThu Jan 22 10:18:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe (f191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e)\nf191aaab96e6668ae17d87705a6b6b6f9a3dd6201811cb5c99fd3bf7b807270e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.138 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7d0c4094-59e2-4feb-ad0c-585c1909d51b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.139 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b7cb047-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:06 np0005592767 kernel: tap1b7cb047-70: left promiscuous mode
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.141 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.152 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.153 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5b4576-d5e7-438e-b993-3d36cd828de4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.168 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[106e0163-efa8-49ac-b6db-f572a437ad54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.169 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8a82bfc5-6b12-46ac-84ab-efa58d9f9ea4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.173 182627 DEBUG nova.compute.manager [req-f81372f1-cd20-4fa5-a059-676c82b3fca5 req-8a3eb8db-d8cf-4a9b-b60e-8c2b102f7bf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.173 182627 DEBUG oslo_concurrency.lockutils [req-f81372f1-cd20-4fa5-a059-676c82b3fca5 req-8a3eb8db-d8cf-4a9b-b60e-8c2b102f7bf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.173 182627 DEBUG oslo_concurrency.lockutils [req-f81372f1-cd20-4fa5-a059-676c82b3fca5 req-8a3eb8db-d8cf-4a9b-b60e-8c2b102f7bf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.174 182627 DEBUG oslo_concurrency.lockutils [req-f81372f1-cd20-4fa5-a059-676c82b3fca5 req-8a3eb8db-d8cf-4a9b-b60e-8c2b102f7bf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.174 182627 DEBUG nova.compute.manager [req-f81372f1-cd20-4fa5-a059-676c82b3fca5 req-8a3eb8db-d8cf-4a9b-b60e-8c2b102f7bf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.174 182627 DEBUG nova.compute.manager [req-f81372f1-cd20-4fa5-a059-676c82b3fca5 req-8a3eb8db-d8cf-4a9b-b60e-8c2b102f7bf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-unplugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.187 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[88f6baa8-1778-4b4a-adde-6e9e3a61ce79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387564, 'reachable_time': 40871, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213649, 'error': None, 'target': 'ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 systemd[1]: run-netns-ovnmeta\x2d1b7cb047\x2d7415\x2d4b9a\x2dbe62\x2d075d33a42dfe.mount: Deactivated successfully.
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.191 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b7cb047-7415-4b9a-be62-075d33a42dfe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.191 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[faf80d70-b436-4b41-9c7d-ed5bb3981731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.193 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 1d0bf445-f745-430d-9927-a3d8cdc9b6fc in datapath 0265f228-4e11-4f15-8d77-6acb409f3f7b unbound from our chassis#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.194 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0265f228-4e11-4f15-8d77-6acb409f3f7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.195 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[67c7a9dc-7be6-47e5-9609-8dbe9d4d4849]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.195 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b namespace which is not needed anymore#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.215 182627 INFO nova.compute.manager [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.215 182627 DEBUG oslo.service.loopingcall [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.216 182627 DEBUG nova.compute.manager [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.216 182627 DEBUG nova.network.neutron [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [NOTICE]   (213501) : haproxy version is 2.8.14-c23fe91
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [NOTICE]   (213501) : path to executable is /usr/sbin/haproxy
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [WARNING]  (213501) : Exiting Master process...
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [WARNING]  (213501) : Exiting Master process...
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [ALERT]    (213501) : Current worker (213505) exited with code 143 (Terminated)
Jan 22 17:18:06 np0005592767 neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b[213486]: [WARNING]  (213501) : All workers exited. Exiting... (0)
Jan 22 17:18:06 np0005592767 systemd[1]: libpod-1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f.scope: Deactivated successfully.
Jan 22 17:18:06 np0005592767 podman[213665]: 2026-01-22 22:18:06.325967392 +0000 UTC m=+0.045987692 container died 1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:18:06 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:18:06 np0005592767 systemd[1]: var-lib-containers-storage-overlay-75e89aa6f5b7df1e0331cdfa044f2ed96cd6db5a84f1b0ce1cf349bd0f80e591-merged.mount: Deactivated successfully.
Jan 22 17:18:06 np0005592767 podman[213665]: 2026-01-22 22:18:06.364938109 +0000 UTC m=+0.084958389 container cleanup 1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:18:06 np0005592767 systemd[1]: libpod-conmon-1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f.scope: Deactivated successfully.
Jan 22 17:18:06 np0005592767 podman[213694]: 2026-01-22 22:18:06.430229027 +0000 UTC m=+0.041221082 container remove 1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.435 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cab4748c-875a-42fa-bc56-d084119d96b1]: (4, ('Thu Jan 22 10:18:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f)\n1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f\nThu Jan 22 10:18:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b (1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f)\n1188392237af7a535152b86a626b4d4d52906b5cea00c101696fecebc75dd40f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.437 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d4d078-28b7-4a5f-9b08-003016ed9bb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.438 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0265f228-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 kernel: tap0265f228-40: left promiscuous mode
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.441 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.444 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d85d05a1-56a6-4f95-923a-fda05d5f4c6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.453 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.460 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e7cc9075-5a72-4a0a-a97d-383814d7f6d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.461 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb66457-23b2-406f-8202-43e65a7d9ae7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.474 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4a3df464-6ed2-47d8-b156-7f8d778d846f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387649, 'reachable_time': 27596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213709, 'error': None, 'target': 'ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.476 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0265f228-4e11-4f15-8d77-6acb409f3f7b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:18:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:06.476 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd94e77-8cfe-4fac-9d6e-4b3e60981e4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.917 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.917 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.918 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:06 np0005592767 nova_compute[182623]: 2026-01-22 22:18:06.918 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:18:07 np0005592767 systemd[1]: run-netns-ovnmeta\x2d0265f228\x2d4e11\x2d4f15\x2d8d77\x2d6acb409f3f7b.mount: Deactivated successfully.
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:18:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:18:08 np0005592767 podman[213710]: 2026-01-22 22:18:08.13581699 +0000 UTC m=+0.058651048 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.336 182627 DEBUG nova.compute.manager [req-802b1c4d-0747-4130-85af-66d1fe0d5d7b req-4d5ce924-c5ea-4b83-9e72-7575f4e1b5d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.336 182627 DEBUG oslo_concurrency.lockutils [req-802b1c4d-0747-4130-85af-66d1fe0d5d7b req-4d5ce924-c5ea-4b83-9e72-7575f4e1b5d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.336 182627 DEBUG oslo_concurrency.lockutils [req-802b1c4d-0747-4130-85af-66d1fe0d5d7b req-4d5ce924-c5ea-4b83-9e72-7575f4e1b5d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.336 182627 DEBUG oslo_concurrency.lockutils [req-802b1c4d-0747-4130-85af-66d1fe0d5d7b req-4d5ce924-c5ea-4b83-9e72-7575f4e1b5d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.337 182627 DEBUG nova.compute.manager [req-802b1c4d-0747-4130-85af-66d1fe0d5d7b req-4d5ce924-c5ea-4b83-9e72-7575f4e1b5d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] No waiting events found dispatching network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.337 182627 WARNING nova.compute.manager [req-802b1c4d-0747-4130-85af-66d1fe0d5d7b req-4d5ce924-c5ea-4b83-9e72-7575f4e1b5d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Received unexpected event network-vif-plugged-1d0bf445-f745-430d-9927-a3d8cdc9b6fc for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.445 182627 DEBUG nova.network.neutron [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.465 182627 INFO nova.compute.manager [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Took 2.25 seconds to deallocate network for instance.#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.542 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.543 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.546 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.570 182627 INFO nova.scheduler.client.report [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Deleted allocations for instance eb864a01-1633-42f3-ac5f-4d664cc5d477#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.634 182627 DEBUG oslo_concurrency.lockutils [None req-36ecca94-c5dd-4900-8455-c95244363e00 f591d36af603475bbc613d6c93854a42 4ff5f7f17f1c471986dfd67f5192359f - - default default] Lock "eb864a01-1633-42f3-ac5f-4d664cc5d477" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.917 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.917 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.917 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:08 np0005592767 nova_compute[182623]: 2026-01-22 22:18:08.918 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.077 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.078 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5603MB free_disk=73.37931060791016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.079 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.079 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.126 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.127 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.145 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.157 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.176 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.176 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:09 np0005592767 nova_compute[182623]: 2026-01-22 22:18:09.525 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:10 np0005592767 nova_compute[182623]: 2026-01-22 22:18:10.145 182627 DEBUG oslo_concurrency.processutils [None req-745edf84-2ea1-4d58-a9e4-e4923066faa8 1c4bbb4198b54017b85a5d92d3327587 486dac1280fe467a88cae13201c5c26b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:10 np0005592767 nova_compute[182623]: 2026-01-22 22:18:10.192 182627 DEBUG oslo_concurrency.processutils [None req-745edf84-2ea1-4d58-a9e4-e4923066faa8 1c4bbb4198b54017b85a5d92d3327587 486dac1280fe467a88cae13201c5c26b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:11 np0005592767 nova_compute[182623]: 2026-01-22 22:18:11.127 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:12.090 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:12.091 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:12.091 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:14 np0005592767 nova_compute[182623]: 2026-01-22 22:18:14.528 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:15 np0005592767 nova_compute[182623]: 2026-01-22 22:18:15.652 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:15.652 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:18:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:15.654 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:18:16 np0005592767 nova_compute[182623]: 2026-01-22 22:18:16.135 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:19 np0005592767 nova_compute[182623]: 2026-01-22 22:18:19.530 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:19.657 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:19 np0005592767 nova_compute[182623]: 2026-01-22 22:18:19.791 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120284.788507, 84d2041e-03a2-4fac-b088-240a1b0badef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:19 np0005592767 nova_compute[182623]: 2026-01-22 22:18:19.791 182627 INFO nova.compute.manager [-] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:18:19 np0005592767 nova_compute[182623]: 2026-01-22 22:18:19.811 182627 DEBUG nova.compute.manager [None req-31943984-0d06-4701-aebf-2b6b6c0e82a6 - - - - - -] [instance: 84d2041e-03a2-4fac-b088-240a1b0badef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:20 np0005592767 podman[213737]: 2026-01-22 22:18:20.137800862 +0000 UTC m=+0.056978469 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:18:21 np0005592767 nova_compute[182623]: 2026-01-22 22:18:21.102 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120286.1005044, eb864a01-1633-42f3-ac5f-4d664cc5d477 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:21 np0005592767 nova_compute[182623]: 2026-01-22 22:18:21.102 182627 INFO nova.compute.manager [-] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:18:21 np0005592767 nova_compute[182623]: 2026-01-22 22:18:21.119 182627 DEBUG nova.compute.manager [None req-4d55abed-d5c2-47f9-856a-fb75ecd1c657 - - - - - -] [instance: eb864a01-1633-42f3-ac5f-4d664cc5d477] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:21 np0005592767 nova_compute[182623]: 2026-01-22 22:18:21.140 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:23 np0005592767 podman[213757]: 2026-01-22 22:18:23.136061733 +0000 UTC m=+0.050010928 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:18:23 np0005592767 podman[213756]: 2026-01-22 22:18:23.165995309 +0000 UTC m=+0.084322261 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:18:24 np0005592767 nova_compute[182623]: 2026-01-22 22:18:24.531 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:26 np0005592767 nova_compute[182623]: 2026-01-22 22:18:26.144 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:29 np0005592767 podman[213797]: 2026-01-22 22:18:29.144230162 +0000 UTC m=+0.063697614 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:18:29 np0005592767 nova_compute[182623]: 2026-01-22 22:18:29.533 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:31 np0005592767 podman[213816]: 2026-01-22 22:18:31.120303742 +0000 UTC m=+0.044906101 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:18:31 np0005592767 nova_compute[182623]: 2026-01-22 22:18:31.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.594 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.594 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.614 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.721 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.722 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.729 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.730 182627 INFO nova.compute.claims [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.920 182627 DEBUG nova.compute.provider_tree [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.937 182627 DEBUG nova.scheduler.client.report [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.989 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:32 np0005592767 nova_compute[182623]: 2026-01-22 22:18:32.990 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.070 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.070 182627 DEBUG nova.network.neutron [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.092 182627 INFO nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.109 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.248 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.249 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.250 182627 INFO nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating image(s)#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.250 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.251 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.251 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.264 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.321 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.323 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.323 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.337 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.378 182627 DEBUG nova.policy [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f23ea0c335b84bd2b78725d5a5491d0a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '214876cdc63543458d35ee214fe21b2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.392 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.393 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.423 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.424 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.425 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.478 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.480 182627 DEBUG nova.virt.disk.api [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Checking if we can resize image /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.480 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.536 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.537 182627 DEBUG nova.virt.disk.api [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Cannot resize image /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.537 182627 DEBUG nova.objects.instance [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'migration_context' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.553 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.554 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Ensure instance console log exists: /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.555 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.555 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:33 np0005592767 nova_compute[182623]: 2026-01-22 22:18:33.555 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:34 np0005592767 nova_compute[182623]: 2026-01-22 22:18:34.139 182627 DEBUG nova.network.neutron [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Successfully created port: 648c69ef-5bab-43c9-99a7-4b49b3122d56 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:18:34 np0005592767 nova_compute[182623]: 2026-01-22 22:18:34.535 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:35 np0005592767 nova_compute[182623]: 2026-01-22 22:18:35.923 182627 DEBUG nova.network.neutron [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Successfully updated port: 648c69ef-5bab-43c9-99a7-4b49b3122d56 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:18:35 np0005592767 nova_compute[182623]: 2026-01-22 22:18:35.960 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:18:35 np0005592767 nova_compute[182623]: 2026-01-22 22:18:35.961 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquired lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:18:35 np0005592767 nova_compute[182623]: 2026-01-22 22:18:35.961 182627 DEBUG nova.network.neutron [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:18:36 np0005592767 nova_compute[182623]: 2026-01-22 22:18:36.039 182627 DEBUG nova.compute.manager [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-changed-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:18:36 np0005592767 nova_compute[182623]: 2026-01-22 22:18:36.039 182627 DEBUG nova.compute.manager [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Refreshing instance network info cache due to event network-changed-648c69ef-5bab-43c9-99a7-4b49b3122d56. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:18:36 np0005592767 nova_compute[182623]: 2026-01-22 22:18:36.040 182627 DEBUG oslo_concurrency.lockutils [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:18:36 np0005592767 nova_compute[182623]: 2026-01-22 22:18:36.110 182627 DEBUG nova.network.neutron [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:18:36 np0005592767 nova_compute[182623]: 2026-01-22 22:18:36.178 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.349 182627 DEBUG nova.network.neutron [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Updating instance_info_cache with network_info: [{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.368 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Releasing lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.368 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance network_info: |[{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.369 182627 DEBUG oslo_concurrency.lockutils [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.369 182627 DEBUG nova.network.neutron [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Refreshing network info cache for port 648c69ef-5bab-43c9-99a7-4b49b3122d56 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.371 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start _get_guest_xml network_info=[{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.374 182627 WARNING nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.378 182627 DEBUG nova.virt.libvirt.host [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.378 182627 DEBUG nova.virt.libvirt.host [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.380 182627 DEBUG nova.virt.libvirt.host [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.380 182627 DEBUG nova.virt.libvirt.host [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.381 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.382 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.382 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.382 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.383 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.383 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.383 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.383 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.383 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.384 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.384 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.384 182627 DEBUG nova.virt.hardware [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.387 182627 DEBUG nova.virt.libvirt.vif [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-18
25362070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:33Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.388 182627 DEBUG nova.network.os_vif_util [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.389 182627 DEBUG nova.network.os_vif_util [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.390 182627 DEBUG nova.objects.instance [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_devices' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.508 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <uuid>e6db2ef0-a660-4d03-8a2d-9574e7af17d4</uuid>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <name>instance-00000012</name>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersAdminTestJSON-server-1481617455</nova:name>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:18:37</nova:creationTime>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:user uuid="f23ea0c335b84bd2b78725d5a5491d0a">tempest-ServersAdminTestJSON-1825362070-project-member</nova:user>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:project uuid="214876cdc63543458d35ee214fe21b2c">tempest-ServersAdminTestJSON-1825362070</nova:project>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        <nova:port uuid="648c69ef-5bab-43c9-99a7-4b49b3122d56">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <entry name="serial">e6db2ef0-a660-4d03-8a2d-9574e7af17d4</entry>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <entry name="uuid">e6db2ef0-a660-4d03-8a2d-9574e7af17d4</entry>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:ea:a6:7a"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <target dev="tap648c69ef-5b"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/console.log" append="off"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:18:37 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:18:37 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:18:37 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:18:37 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.509 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Preparing to wait for external event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.509 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.510 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.510 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.510 182627 DEBUG nova.virt.libvirt.vif [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminT
estJSON-1825362070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:33Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.511 182627 DEBUG nova.network.os_vif_util [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.511 182627 DEBUG nova.network.os_vif_util [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.512 182627 DEBUG os_vif [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.512 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.512 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.513 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.515 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.515 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap648c69ef-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.515 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap648c69ef-5b, col_values=(('external_ids', {'iface-id': '648c69ef-5bab-43c9-99a7-4b49b3122d56', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:a6:7a', 'vm-uuid': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.516 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:37 np0005592767 NetworkManager[54973]: <info>  [1769120317.5176] manager: (tap648c69ef-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.520 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.525 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.526 182627 INFO os_vif [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b')#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.570 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.571 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.571 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No VIF found with MAC fa:16:3e:ea:a6:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:18:37 np0005592767 nova_compute[182623]: 2026-01-22 22:18:37.571 182627 INFO nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Using config drive#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.040 182627 INFO nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating config drive at /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.045 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwbexd9lx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.166 182627 DEBUG oslo_concurrency.processutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwbexd9lx" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:38 np0005592767 kernel: tap648c69ef-5b: entered promiscuous mode
Jan 22 17:18:38 np0005592767 NetworkManager[54973]: <info>  [1769120318.2285] manager: (tap648c69ef-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 22 17:18:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:38Z|00083|binding|INFO|Claiming lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 for this chassis.
Jan 22 17:18:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:38Z|00084|binding|INFO|648c69ef-5bab-43c9-99a7-4b49b3122d56: Claiming fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.230 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.234 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.240 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.247 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:a6:7a 10.100.0.8'], port_security=['fa:16:3e:ea:a6:7a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=648c69ef-5bab-43c9-99a7-4b49b3122d56) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.248 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 648c69ef-5bab-43c9-99a7-4b49b3122d56 in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c bound to our chassis#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.250 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.266 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8e28e270-d326-4657-a58a-3c53bdbe34b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.289 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19dd816f-61 in ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.292 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19dd816f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.292 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b9fe50-9759-4d92-b6fe-363547b83810]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.293 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfd6db6-bf9f-4d53-8f3a-0b8dc1507d03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 systemd-machined[153912]: New machine qemu-9-instance-00000012.
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.305 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf43d20-d0cd-4003-93eb-23e11020aa34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 systemd-udevd[213890]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:18:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:38Z|00085|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 ovn-installed in OVS
Jan 22 17:18:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:38Z|00086|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 up in Southbound
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.308 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 systemd[1]: Started Virtual Machine qemu-9-instance-00000012.
Jan 22 17:18:38 np0005592767 NetworkManager[54973]: <info>  [1769120318.3199] device (tap648c69ef-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:18:38 np0005592767 NetworkManager[54973]: <info>  [1769120318.3207] device (tap648c69ef-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:18:38 np0005592767 podman[213868]: 2026-01-22 22:18:38.328142856 +0000 UTC m=+0.095738860 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.329 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b9da56d5-ed8e-4fa2-a23e-0f8b487af02c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.353 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0016a315-a808-4249-b07e-60711e1cc15e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 NetworkManager[54973]: <info>  [1769120318.3582] manager: (tap19dd816f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.358 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1ca1e3-c0df-4358-984a-8100736e0499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.385 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fce45291-55fe-41f1-9e88-47b7499709da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.388 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[441a29ce-ee9f-483d-ba07-24c349ebcd10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 NetworkManager[54973]: <info>  [1769120318.4087] device (tap19dd816f-60): carrier: link connected
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.412 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[128684e6-d69d-428f-b410-d8bccc3b8416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.429 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[124539c0-765c-4066-b738-a0f3789a08fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213931, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.441 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2c143bc7-6503-4c59-a71d-1c2f86617329]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:7247'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391498, 'tstamp': 391498}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213932, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.456 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[04a7c4be-070a-447f-b1be-758089149b30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213933, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.480 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ace5a0-e4ce-4bb8-b09e-8b48cc381b5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.526 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe24408-e9e4-46d4-81fa-b59761c8d29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.528 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.528 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.528 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.567 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 kernel: tap19dd816f-60: entered promiscuous mode
Jan 22 17:18:38 np0005592767 NetworkManager[54973]: <info>  [1769120318.5713] manager: (tap19dd816f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.571 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.572 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:38Z|00087|binding|INFO|Releasing lport 32bed344-462e-4b45-8eb9-1fd48f73f73c from this chassis (sb_readonly=0)
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.573 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.574 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[407b9de9-08d7-4833-ba97-c09bfcf79763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.574 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-19dd816f-669a-4bda-b508-a3ddcd4c2d7c
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.pid.haproxy
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 19dd816f-669a-4bda-b508-a3ddcd4c2d7c
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:18:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:38.575 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'env', 'PROCESS_TAG=haproxy-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19dd816f-669a-4bda-b508-a3ddcd4c2d7c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.584 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.678 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120318.6776938, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.678 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Started (Lifecycle Event)#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.700 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.704 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120318.6777508, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.704 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.744 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.747 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.773 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:18:38 np0005592767 podman[213972]: 2026-01-22 22:18:38.934509658 +0000 UTC m=+0.062649834 container create 33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.942 182627 DEBUG nova.compute.manager [req-4620165e-752a-44b8-b253-65170debef0b req-dc581f95-a7a1-4e82-9270-ac51a777e5a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.944 182627 DEBUG oslo_concurrency.lockutils [req-4620165e-752a-44b8-b253-65170debef0b req-dc581f95-a7a1-4e82-9270-ac51a777e5a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.944 182627 DEBUG oslo_concurrency.lockutils [req-4620165e-752a-44b8-b253-65170debef0b req-dc581f95-a7a1-4e82-9270-ac51a777e5a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.944 182627 DEBUG oslo_concurrency.lockutils [req-4620165e-752a-44b8-b253-65170debef0b req-dc581f95-a7a1-4e82-9270-ac51a777e5a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.945 182627 DEBUG nova.compute.manager [req-4620165e-752a-44b8-b253-65170debef0b req-dc581f95-a7a1-4e82-9270-ac51a777e5a2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Processing event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.945 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.950 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120318.949845, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.950 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.952 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.957 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance spawned successfully.#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.959 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:18:38 np0005592767 systemd[1]: Started libpod-conmon-33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a.scope.
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.971 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.974 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.982 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.983 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.983 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.984 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.984 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:38 np0005592767 nova_compute[182623]: 2026-01-22 22:18:38.985 182627 DEBUG nova.virt.libvirt.driver [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:38 np0005592767 podman[213972]: 2026-01-22 22:18:38.890750872 +0000 UTC m=+0.018891088 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:18:38 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:18:38 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/086ab037f8e24150776b7079183c8b007a36dce03c95b0d49b2e7350dd1b76bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:18:39 np0005592767 podman[213972]: 2026-01-22 22:18:39.010498556 +0000 UTC m=+0.138638772 container init 33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.009 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:18:39 np0005592767 podman[213972]: 2026-01-22 22:18:39.015889292 +0000 UTC m=+0.144029478 container start 33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:18:39 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [NOTICE]   (213991) : New worker (213993) forked
Jan 22 17:18:39 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [NOTICE]   (213991) : Loading success.
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.066 182627 INFO nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Took 5.82 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.066 182627 DEBUG nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.070 182627 DEBUG nova.network.neutron [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Updated VIF entry in instance network info cache for port 648c69ef-5bab-43c9-99a7-4b49b3122d56. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.070 182627 DEBUG nova.network.neutron [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Updating instance_info_cache with network_info: [{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.102 182627 DEBUG oslo_concurrency.lockutils [req-d6eaffce-19bb-4b5e-afca-150ffeb87fd5 req-2b7df12a-5312-47f6-ab05-b2b5c6015881 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.168 182627 INFO nova.compute.manager [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Took 6.49 seconds to build instance.
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.189 182627 DEBUG oslo_concurrency.lockutils [None req-3b35ef3a-c3c2-4f1b-824b-9caf71898546 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:39 np0005592767 nova_compute[182623]: 2026-01-22 22:18:39.536 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:41 np0005592767 nova_compute[182623]: 2026-01-22 22:18:41.178 182627 DEBUG nova.compute.manager [req-7a31ef81-b8ed-4862-8d8e-53016e7406d1 req-91d586a4-4e52-40da-b861-8d2a96a3fb44 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:41 np0005592767 nova_compute[182623]: 2026-01-22 22:18:41.178 182627 DEBUG oslo_concurrency.lockutils [req-7a31ef81-b8ed-4862-8d8e-53016e7406d1 req-91d586a4-4e52-40da-b861-8d2a96a3fb44 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:41 np0005592767 nova_compute[182623]: 2026-01-22 22:18:41.178 182627 DEBUG oslo_concurrency.lockutils [req-7a31ef81-b8ed-4862-8d8e-53016e7406d1 req-91d586a4-4e52-40da-b861-8d2a96a3fb44 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:41 np0005592767 nova_compute[182623]: 2026-01-22 22:18:41.179 182627 DEBUG oslo_concurrency.lockutils [req-7a31ef81-b8ed-4862-8d8e-53016e7406d1 req-91d586a4-4e52-40da-b861-8d2a96a3fb44 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:41 np0005592767 nova_compute[182623]: 2026-01-22 22:18:41.227 182627 DEBUG nova.compute.manager [req-7a31ef81-b8ed-4862-8d8e-53016e7406d1 req-91d586a4-4e52-40da-b861-8d2a96a3fb44 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:18:41 np0005592767 nova_compute[182623]: 2026-01-22 22:18:41.227 182627 WARNING nova.compute.manager [req-7a31ef81-b8ed-4862-8d8e-53016e7406d1 req-91d586a4-4e52-40da-b861-8d2a96a3fb44 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state None.
Jan 22 17:18:42 np0005592767 nova_compute[182623]: 2026-01-22 22:18:42.519 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:44 np0005592767 nova_compute[182623]: 2026-01-22 22:18:44.539 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.705 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.705 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.734 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.881 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.881 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.887 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:18:45 np0005592767 nova_compute[182623]: 2026-01-22 22:18:45.887 182627 INFO nova.compute.claims [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.046 182627 DEBUG nova.compute.provider_tree [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.067 182627 DEBUG nova.scheduler.client.report [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.094 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.095 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.308 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.309 182627 DEBUG nova.network.neutron [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.337 182627 INFO nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.365 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.469 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.471 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.471 182627 INFO nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Creating image(s)
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.472 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "/var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.473 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.474 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.494 182627 DEBUG nova.policy [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f23ea0c335b84bd2b78725d5a5491d0a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '214876cdc63543458d35ee214fe21b2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.497 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.559 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.560 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.561 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.575 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.633 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.634 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.669 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.670 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.671 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.728 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.729 182627 DEBUG nova.virt.disk.api [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Checking if we can resize image /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.730 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.790 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.791 182627 DEBUG nova.virt.disk.api [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Cannot resize image /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.792 182627 DEBUG nova.objects.instance [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'migration_context' on Instance uuid 2925f68f-5cfe-47c2-b952-de9856d8ab82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.807 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.807 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Ensure instance console log exists: /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.808 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.808 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:18:46 np0005592767 nova_compute[182623]: 2026-01-22 22:18:46.808 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:18:47 np0005592767 nova_compute[182623]: 2026-01-22 22:18:47.109 182627 DEBUG nova.network.neutron [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Successfully created port: 598d7930-e98a-4d8d-b339-a3edf37f15dd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:18:47 np0005592767 nova_compute[182623]: 2026-01-22 22:18:47.521 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.313 182627 DEBUG nova.network.neutron [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Successfully updated port: 598d7930-e98a-4d8d-b339-a3edf37f15dd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.335 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "refresh_cache-2925f68f-5cfe-47c2-b952-de9856d8ab82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.335 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquired lock "refresh_cache-2925f68f-5cfe-47c2-b952-de9856d8ab82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.335 182627 DEBUG nova.network.neutron [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.465 182627 DEBUG nova.compute.manager [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received event network-changed-598d7930-e98a-4d8d-b339-a3edf37f15dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.466 182627 DEBUG nova.compute.manager [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Refreshing instance network info cache due to event network-changed-598d7930-e98a-4d8d-b339-a3edf37f15dd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.479 182627 DEBUG oslo_concurrency.lockutils [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2925f68f-5cfe-47c2-b952-de9856d8ab82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:18:48 np0005592767 nova_compute[182623]: 2026-01-22 22:18:48.553 182627 DEBUG nova.network.neutron [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.341 182627 DEBUG nova.network.neutron [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Updating instance_info_cache with network_info: [{"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.362 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Releasing lock "refresh_cache-2925f68f-5cfe-47c2-b952-de9856d8ab82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.362 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Instance network_info: |[{"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.363 182627 DEBUG oslo_concurrency.lockutils [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2925f68f-5cfe-47c2-b952-de9856d8ab82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.363 182627 DEBUG nova.network.neutron [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Refreshing network info cache for port 598d7930-e98a-4d8d-b339-a3edf37f15dd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.366 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Start _get_guest_xml network_info=[{"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.370 182627 WARNING nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.374 182627 DEBUG nova.virt.libvirt.host [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.374 182627 DEBUG nova.virt.libvirt.host [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.376 182627 DEBUG nova.virt.libvirt.host [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.377 182627 DEBUG nova.virt.libvirt.host [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.379 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.379 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.379 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.379 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.380 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.380 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.380 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.380 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.380 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.381 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.381 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.381 182627 DEBUG nova.virt.hardware [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.386 182627 DEBUG nova.virt.libvirt.vif [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1179523829',display_name='tempest-ServersAdminTestJSON-server-1179523829',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1179523829',id=21,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-6vn4bppz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:46Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=2925f68f-5cfe-47c2-b952-de9856d8ab82,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.386 182627 DEBUG nova.network.os_vif_util [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.387 182627 DEBUG nova.network.os_vif_util [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.388 182627 DEBUG nova.objects.instance [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2925f68f-5cfe-47c2-b952-de9856d8ab82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.411 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <uuid>2925f68f-5cfe-47c2-b952-de9856d8ab82</uuid>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <name>instance-00000015</name>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersAdminTestJSON-server-1179523829</nova:name>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:18:49</nova:creationTime>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:user uuid="f23ea0c335b84bd2b78725d5a5491d0a">tempest-ServersAdminTestJSON-1825362070-project-member</nova:user>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:project uuid="214876cdc63543458d35ee214fe21b2c">tempest-ServersAdminTestJSON-1825362070</nova:project>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        <nova:port uuid="598d7930-e98a-4d8d-b339-a3edf37f15dd">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <entry name="serial">2925f68f-5cfe-47c2-b952-de9856d8ab82</entry>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <entry name="uuid">2925f68f-5cfe-47c2-b952-de9856d8ab82</entry>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.config"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:a2:9b:74"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <target dev="tap598d7930-e9"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/console.log" append="off"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:18:49 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:18:49 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:18:49 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:18:49 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.412 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Preparing to wait for external event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.412 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.412 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.413 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.413 182627 DEBUG nova.virt.libvirt.vif [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:18:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1179523829',display_name='tempest-ServersAdminTestJSON-server-1179523829',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1179523829',id=21,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-6vn4bppz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:46Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=2925f68f-5cfe-47c2-b952-de9856d8ab82,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.414 182627 DEBUG nova.network.os_vif_util [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.414 182627 DEBUG nova.network.os_vif_util [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.414 182627 DEBUG os_vif [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.415 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.415 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.416 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.419 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.419 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap598d7930-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.420 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap598d7930-e9, col_values=(('external_ids', {'iface-id': '598d7930-e98a-4d8d-b339-a3edf37f15dd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:9b:74', 'vm-uuid': '2925f68f-5cfe-47c2-b952-de9856d8ab82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.424 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:18:49 np0005592767 NetworkManager[54973]: <info>  [1769120329.4259] manager: (tap598d7930-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.430 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.431 182627 INFO os_vif [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9')#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.521 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.521 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.521 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No VIF found with MAC fa:16:3e:a2:9b:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.522 182627 INFO nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Using config drive#033[00m
Jan 22 17:18:49 np0005592767 nova_compute[182623]: 2026-01-22 22:18:49.540 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:50Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:18:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:50Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:18:51 np0005592767 podman[214033]: 2026-01-22 22:18:51.150255798 +0000 UTC m=+0.063250490 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:18:51 np0005592767 nova_compute[182623]: 2026-01-22 22:18:51.697 182627 INFO nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Creating config drive at /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.config#033[00m
Jan 22 17:18:51 np0005592767 nova_compute[182623]: 2026-01-22 22:18:51.704 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpycios1_w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:18:51 np0005592767 nova_compute[182623]: 2026-01-22 22:18:51.830 182627 DEBUG oslo_concurrency.processutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpycios1_w" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:18:51 np0005592767 kernel: tap598d7930-e9: entered promiscuous mode
Jan 22 17:18:51 np0005592767 NetworkManager[54973]: <info>  [1769120331.8823] manager: (tap598d7930-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 22 17:18:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:51Z|00088|binding|INFO|Claiming lport 598d7930-e98a-4d8d-b339-a3edf37f15dd for this chassis.
Jan 22 17:18:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:51Z|00089|binding|INFO|598d7930-e98a-4d8d-b339-a3edf37f15dd: Claiming fa:16:3e:a2:9b:74 10.100.0.7
Jan 22 17:18:51 np0005592767 nova_compute[182623]: 2026-01-22 22:18:51.928 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:51Z|00090|binding|INFO|Setting lport 598d7930-e98a-4d8d-b339-a3edf37f15dd ovn-installed in OVS
Jan 22 17:18:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:18:51Z|00091|binding|INFO|Setting lport 598d7930-e98a-4d8d-b339-a3edf37f15dd up in Southbound
Jan 22 17:18:51 np0005592767 nova_compute[182623]: 2026-01-22 22:18:51.941 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:51.940 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:9b:74 10.100.0.7'], port_security=['fa:16:3e:a2:9b:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2925f68f-5cfe-47c2-b952-de9856d8ab82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=598d7930-e98a-4d8d-b339-a3edf37f15dd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:18:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:51.944 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 598d7930-e98a-4d8d-b339-a3edf37f15dd in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c bound to our chassis#033[00m
Jan 22 17:18:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:51.946 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:18:51 np0005592767 systemd-machined[153912]: New machine qemu-10-instance-00000015.
Jan 22 17:18:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:51.962 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ad207a-d13a-43c8-a76c-4cc8e2da7d94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:51 np0005592767 systemd[1]: Started Virtual Machine qemu-10-instance-00000015.
Jan 22 17:18:51 np0005592767 systemd-udevd[214073]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:18:51 np0005592767 NetworkManager[54973]: <info>  [1769120331.9876] device (tap598d7930-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:18:51 np0005592767 NetworkManager[54973]: <info>  [1769120331.9881] device (tap598d7930-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:51.999 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b078b9e4-1458-44f0-91a3-6f1bf27dfcdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.004 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a6192376-ee3a-45d9-9ae2-576c816041da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.029 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0e678c6c-e88b-4496-9a4e-deef96ccd7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.050 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a6df3b-d884-41e2-8f48-a5bbffea3cda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214084, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.069 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e23b4f44-c201-4407-9925-2801254bad70]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391507, 'tstamp': 391507}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214086, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391510, 'tstamp': 391510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214086, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.071 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.075 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:18:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:18:52.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.586 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120332.5863776, 2925f68f-5cfe-47c2-b952-de9856d8ab82 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.587 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] VM Started (Lifecycle Event)#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.609 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.614 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120332.5874553, 2925f68f-5cfe-47c2-b952-de9856d8ab82 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.615 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.636 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.641 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:18:52 np0005592767 nova_compute[182623]: 2026-01-22 22:18:52.658 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.429 182627 DEBUG nova.compute.manager [req-ec125aba-0e63-4ec5-91a8-dc2070f2a05a req-00a0288c-4838-4c52-ad10-27749bc56e09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.430 182627 DEBUG oslo_concurrency.lockutils [req-ec125aba-0e63-4ec5-91a8-dc2070f2a05a req-00a0288c-4838-4c52-ad10-27749bc56e09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.430 182627 DEBUG oslo_concurrency.lockutils [req-ec125aba-0e63-4ec5-91a8-dc2070f2a05a req-00a0288c-4838-4c52-ad10-27749bc56e09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.430 182627 DEBUG oslo_concurrency.lockutils [req-ec125aba-0e63-4ec5-91a8-dc2070f2a05a req-00a0288c-4838-4c52-ad10-27749bc56e09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.431 182627 DEBUG nova.compute.manager [req-ec125aba-0e63-4ec5-91a8-dc2070f2a05a req-00a0288c-4838-4c52-ad10-27749bc56e09 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Processing event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.432 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.435 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120333.43505, 2925f68f-5cfe-47c2-b952-de9856d8ab82 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.435 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.437 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.441 182627 INFO nova.virt.libvirt.driver [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Instance spawned successfully.#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.441 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.478 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.485 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.486 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.487 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.487 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.488 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.488 182627 DEBUG nova.virt.libvirt.driver [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.491 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.537 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.577 182627 INFO nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Took 7.11 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.578 182627 DEBUG nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.629 182627 DEBUG nova.network.neutron [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Updated VIF entry in instance network info cache for port 598d7930-e98a-4d8d-b339-a3edf37f15dd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.630 182627 DEBUG nova.network.neutron [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Updating instance_info_cache with network_info: [{"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.642 182627 DEBUG oslo_concurrency.lockutils [req-e3a3fbee-3718-4e9e-9ce7-70f426ae3bb0 req-8c2cbf54-2516-45d0-9253-fb786771417d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2925f68f-5cfe-47c2-b952-de9856d8ab82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.652 182627 INFO nova.compute.manager [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Took 7.85 seconds to build instance.#033[00m
Jan 22 17:18:53 np0005592767 nova_compute[182623]: 2026-01-22 22:18:53.667 182627 DEBUG oslo_concurrency.lockutils [None req-48ade924-a6cc-412c-bc9e-3be0950dde3e f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.962s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:54 np0005592767 podman[214095]: 2026-01-22 22:18:54.164059258 +0000 UTC m=+0.071744906 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64)
Jan 22 17:18:54 np0005592767 podman[214094]: 2026-01-22 22:18:54.254475472 +0000 UTC m=+0.158302128 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:18:54 np0005592767 nova_compute[182623]: 2026-01-22 22:18:54.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:54 np0005592767 nova_compute[182623]: 2026-01-22 22:18:54.544 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:55 np0005592767 nova_compute[182623]: 2026-01-22 22:18:55.654 182627 DEBUG nova.compute.manager [req-f6a7ab92-c672-4ee6-829c-488d4c2ad494 req-fbb88924-1ca6-4d15-bd04-cb1f35807d69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:18:55 np0005592767 nova_compute[182623]: 2026-01-22 22:18:55.654 182627 DEBUG oslo_concurrency.lockutils [req-f6a7ab92-c672-4ee6-829c-488d4c2ad494 req-fbb88924-1ca6-4d15-bd04-cb1f35807d69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:18:55 np0005592767 nova_compute[182623]: 2026-01-22 22:18:55.655 182627 DEBUG oslo_concurrency.lockutils [req-f6a7ab92-c672-4ee6-829c-488d4c2ad494 req-fbb88924-1ca6-4d15-bd04-cb1f35807d69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:18:55 np0005592767 nova_compute[182623]: 2026-01-22 22:18:55.655 182627 DEBUG oslo_concurrency.lockutils [req-f6a7ab92-c672-4ee6-829c-488d4c2ad494 req-fbb88924-1ca6-4d15-bd04-cb1f35807d69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:18:55 np0005592767 nova_compute[182623]: 2026-01-22 22:18:55.656 182627 DEBUG nova.compute.manager [req-f6a7ab92-c672-4ee6-829c-488d4c2ad494 req-fbb88924-1ca6-4d15-bd04-cb1f35807d69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] No waiting events found dispatching network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:18:55 np0005592767 nova_compute[182623]: 2026-01-22 22:18:55.656 182627 WARNING nova.compute.manager [req-f6a7ab92-c672-4ee6-829c-488d4c2ad494 req-fbb88924-1ca6-4d15-bd04-cb1f35807d69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received unexpected event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd for instance with vm_state active and task_state None.#033[00m
Jan 22 17:18:59 np0005592767 nova_compute[182623]: 2026-01-22 22:18:59.424 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:18:59 np0005592767 nova_compute[182623]: 2026-01-22 22:18:59.544 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:00 np0005592767 podman[214139]: 2026-01-22 22:19:00.129220912 +0000 UTC m=+0.047823683 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:19:02 np0005592767 podman[214158]: 2026-01-22 22:19:02.176607037 +0000 UTC m=+0.056163355 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:19:04 np0005592767 nova_compute[182623]: 2026-01-22 22:19:04.086 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating tmpfile /var/lib/nova/instances/tmp6c7184m7 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 22 17:19:04 np0005592767 nova_compute[182623]: 2026-01-22 22:19:04.088 182627 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 22 17:19:04 np0005592767 nova_compute[182623]: 2026-01-22 22:19:04.176 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:04 np0005592767 nova_compute[182623]: 2026-01-22 22:19:04.427 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:04 np0005592767 nova_compute[182623]: 2026-01-22 22:19:04.546 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.551 182627 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.590 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.591 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.591 182627 DEBUG nova.network.neutron [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.911 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:05 np0005592767 nova_compute[182623]: 2026-01-22 22:19:05.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.019 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.021 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.022 182627 INFO nova.compute.manager [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Unshelving#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.153 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.154 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.159 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'pci_requests' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.176 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'numa_topology' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.210 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.211 182627 INFO nova.compute.claims [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.403 182627 DEBUG nova.compute.provider_tree [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.417 182627 DEBUG nova.scheduler.client.report [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.439 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.690 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "refresh_cache-2fa0a577-f149-488d-8c47-6dfa4ca56c67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.691 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquired lock "refresh_cache-2fa0a577-f149-488d-8c47-6dfa4ca56c67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.691 182627 DEBUG nova.network.neutron [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:19:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:06Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:9b:74 10.100.0.7
Jan 22 17:19:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:06Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:9b:74 10.100.0.7
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.882 182627 DEBUG nova.network.neutron [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.908 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:06 np0005592767 nova_compute[182623]: 2026-01-22 22:19:06.909 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.170 182627 DEBUG nova.network.neutron [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.187 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Releasing lock "refresh_cache-2fa0a577-f149-488d-8c47-6dfa4ca56c67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.189 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.190 182627 INFO nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Creating image(s)#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.190 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "/var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.191 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "/var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.192 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "/var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.192 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.209 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.210 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.416 182627 DEBUG nova.network.neutron [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.433 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.443 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.444 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating instance directory: /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.444 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Creating disk.info with the contents: {'/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk': 'qcow2', '/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.444 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.445 182627 DEBUG nova.objects.instance [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'trusted_certs' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.472 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.540 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.541 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.542 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.557 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.619 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.620 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.654 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.655 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.655 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.713 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.715 182627 DEBUG nova.virt.disk.api [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Checking if we can resize image /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.715 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.779 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.780 182627 DEBUG nova.virt.disk.api [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Cannot resize image /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.781 182627 DEBUG nova.objects.instance [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'migration_context' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.858 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.882 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.884 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config to /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.884 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.901 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.902 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:19:07 np0005592767 nova_compute[182623]: 2026-01-22 22:19:07.902 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.105 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.106 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.106 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.106 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.433 182627 DEBUG oslo_concurrency.processutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk.config /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.434 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.435 182627 DEBUG nova.virt.libvirt.vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:18:59Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.436 182627 DEBUG nova.network.os_vif_util [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.437 182627 DEBUG nova.network.os_vif_util [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.437 182627 DEBUG os_vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.439 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.440 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.497 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.498 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap580dc508-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.498 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap580dc508-63, col_values=(('external_ids', {'iface-id': '580dc508-636a-420e-aed2-8efd9dccace5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:a3:f5', 'vm-uuid': '469eaf2b-7d53-40c9-a233-b27d702a21ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:08 np0005592767 NetworkManager[54973]: <info>  [1769120348.5012] manager: (tap580dc508-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.501 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.507 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.508 182627 INFO os_vif [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63')
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.508 182627 DEBUG nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 22 17:19:08 np0005592767 nova_compute[182623]: 2026-01-22 22:19:08.509 182627 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 22 17:19:09 np0005592767 podman[214221]: 2026-01-22 22:19:09.152120658 +0000 UTC m=+0.063479176 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.547 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.581 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.635 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.part --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.636 182627 DEBUG nova.virt.images [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] 56588cde-7349-4577-b113-fd627f3712f4 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.637 182627 DEBUG nova.privsep.utils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.638 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.part /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:09 np0005592767 nova_compute[182623]: 2026-01-22 22:19:09.990 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.part /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.converted" returned: 0 in 0.352s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.000 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.053 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32.converted --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.054 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.067 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.124 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.125 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.126 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.141 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.197 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.198 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32,backing_fmt=raw /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.229 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32,backing_fmt=raw /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.230 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.230 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.290 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.291 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'migration_context' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.305 182627 INFO nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Rebasing disk image.
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.305 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.367 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.368 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e -F raw /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.528 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Updating instance_info_cache with network_info: [{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.558 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-e6db2ef0-a660-4d03-8a2d-9574e7af17d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.559 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.559 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.560 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.560 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.792 182627 DEBUG nova.network.neutron [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Port 580dc508-636a-420e-aed2-8efd9dccace5 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.806 182627 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6c7184m7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.906 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.907 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.934 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.935 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.935 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:10 np0005592767 nova_compute[182623]: 2026-01-22 22:19:10.935 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:19:11 np0005592767 kernel: tap580dc508-63: entered promiscuous mode
Jan 22 17:19:11 np0005592767 NetworkManager[54973]: <info>  [1769120351.1485] manager: (tap580dc508-63): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.149 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:11Z|00092|binding|INFO|Claiming lport 580dc508-636a-420e-aed2-8efd9dccace5 for this additional chassis.
Jan 22 17:19:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:11Z|00093|binding|INFO|580dc508-636a-420e-aed2-8efd9dccace5: Claiming fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 17:19:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:11Z|00094|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 ovn-installed in OVS
Jan 22 17:19:11 np0005592767 systemd-udevd[214290]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:11 np0005592767 NetworkManager[54973]: <info>  [1769120351.3025] device (tap580dc508-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:19:11 np0005592767 NetworkManager[54973]: <info>  [1769120351.3036] device (tap580dc508-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:19:11 np0005592767 systemd-machined[153912]: New machine qemu-11-instance-00000016.
Jan 22 17:19:11 np0005592767 systemd[1]: Started Virtual Machine qemu-11-instance-00000016.
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.550 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.650 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.650 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.722 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.728 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.887 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:11 np0005592767 nova_compute[182623]: 2026-01-22 22:19:11.888 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.025 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.032 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.088 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.089 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:12.091 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:12.092 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:12.093 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.150 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.434 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.436 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5386MB free_disk=73.23805618286133GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.436 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.436 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.487 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration for instance 469eaf2b-7d53-40c9-a233-b27d702a21ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.507 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating resource usage from migration d66e6672-9239-4702-afa5-407b94c993b2#033[00m
Jan 22 17:19:12 np0005592767 nova_compute[182623]: 2026-01-22 22:19:12.508 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Starting to track incoming migration d66e6672-9239-4702-afa5-407b94c993b2 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.214 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance e6db2ef0-a660-4d03-8a2d-9574e7af17d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.214 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 2925f68f-5cfe-47c2-b952-de9856d8ab82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.239 182627 WARNING nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 469eaf2b-7d53-40c9-a233-b27d702a21ed has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.239 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 2fa0a577-f149-488d-8c47-6dfa4ca56c67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.240 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.240 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.436 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.454 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.473 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.473 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.474 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.474 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.491 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:19:13 np0005592767 nova_compute[182623]: 2026-01-22 22:19:13.501 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.064 182627 INFO nova.compute.manager [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Rebuilding instance#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.363 182627 DEBUG nova.compute.manager [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.424 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_requests' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.444 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_devices' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.455 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'resources' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.463 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'migration_context' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.471 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.474 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.552 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.674 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120354.6738327, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.674 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Started (Lifecycle Event)#033[00m
Jan 22 17:19:14 np0005592767 nova_compute[182623]: 2026-01-22 22:19:14.702 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.539 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e -F raw /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk" returned: 0 in 5.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.540 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.540 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Ensure instance console log exists: /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.541 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.541 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.541 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.543 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c05643880ac3f99fc0be73afa9b00e7d',container_format='bare',created_at=2026-01-22T22:18:44Z,direct_url=<?>,disk_format='qcow2',id=56588cde-7349-4577-b113-fd627f3712f4,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2088756578-shelved',owner='308077846c7f447188b9270a5844b0b1',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-22T22:19:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.547 182627 WARNING nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.551 182627 DEBUG nova.virt.libvirt.host [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.551 182627 DEBUG nova.virt.libvirt.host [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.554 182627 DEBUG nova.virt.libvirt.host [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.555 182627 DEBUG nova.virt.libvirt.host [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.556 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.556 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c05643880ac3f99fc0be73afa9b00e7d',container_format='bare',created_at=2026-01-22T22:18:44Z,direct_url=<?>,disk_format='qcow2',id=56588cde-7349-4577-b113-fd627f3712f4,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2088756578-shelved',owner='308077846c7f447188b9270a5844b0b1',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-22T22:19:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.556 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.557 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.557 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.557 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.557 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.558 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.558 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.558 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.558 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.559 182627 DEBUG nova.virt.hardware [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.559 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.582 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'pci_devices' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.594 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <uuid>2fa0a577-f149-488d-8c47-6dfa4ca56c67</uuid>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <name>instance-00000011</name>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-2088756578</nova:name>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:19:15</nova:creationTime>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:user uuid="00c21612db1c4c16a7cf8957fc8ca445">tempest-UnshelveToHostMultiNodesTest-1743096818-project-member</nova:user>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:        <nova:project uuid="308077846c7f447188b9270a5844b0b1">tempest-UnshelveToHostMultiNodesTest-1743096818</nova:project>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="56588cde-7349-4577-b113-fd627f3712f4"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <entry name="serial">2fa0a577-f149-488d-8c47-6dfa4ca56c67</entry>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <entry name="uuid">2fa0a577-f149-488d-8c47-6dfa4ca56c67</entry>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.config"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/console.log" append="off"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <input type="keyboard" bus="usb"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:19:15 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:19:15 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:19:15 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:19:15 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.661 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.661 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.662 182627 INFO nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Using config drive#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.675 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.737 182627 DEBUG nova.objects.instance [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lazy-loading 'keypairs' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.815 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120355.815485, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.816 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.858 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.861 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:15 np0005592767 nova_compute[182623]: 2026-01-22 22:19:15.892 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.031 182627 INFO nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Creating config drive at /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.config#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.037 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5c8m4din execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.161 182627 DEBUG oslo_concurrency.processutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5c8m4din" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:16 np0005592767 systemd-machined[153912]: New machine qemu-12-instance-00000011.
Jan 22 17:19:16 np0005592767 systemd[1]: Started Virtual Machine qemu-12-instance-00000011.
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.717 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120356.7167153, 2fa0a577-f149-488d-8c47-6dfa4ca56c67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.718 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.722 182627 DEBUG nova.compute.manager [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.722 182627 DEBUG nova.virt.libvirt.driver [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.726 182627 INFO nova.virt.libvirt.driver [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Instance spawned successfully.#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.751 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.755 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:16 np0005592767 kernel: tap648c69ef-5b (unregistering): left promiscuous mode
Jan 22 17:19:16 np0005592767 NetworkManager[54973]: <info>  [1769120356.7643] device (tap648c69ef-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.775 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:16Z|00095|binding|INFO|Releasing lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 from this chassis (sb_readonly=0)
Jan 22 17:19:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:16Z|00096|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 down in Southbound
Jan 22 17:19:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:16Z|00097|binding|INFO|Removing iface tap648c69ef-5b ovn-installed in OVS
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.779 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.785 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.785 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120356.7181716, 2fa0a577-f149-488d-8c47-6dfa4ca56c67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.786 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] VM Started (Lifecycle Event)#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.790 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:a6:7a 10.100.0.8'], port_security=['fa:16:3e:ea:a6:7a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=648c69ef-5bab-43c9-99a7-4b49b3122d56) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.793 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 648c69ef-5bab-43c9-99a7-4b49b3122d56 in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c unbound from our chassis#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.794 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.796 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.806 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.810 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.819 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[78a982da-0988-4bd3-8162-b5d1082f60ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 22 17:19:16 np0005592767 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000012.scope: Consumed 13.757s CPU time.
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.825 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:19:16 np0005592767 systemd-machined[153912]: Machine qemu-9-instance-00000012 terminated.
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.861 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ebbe63-da8d-4c33-8d60-614158e9b812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.865 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f07e1845-07ce-45fb-bd9b-fbc8956a896d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.887 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[02442805-c747-42d5-b8fe-798864e18c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:16Z|00098|binding|INFO|Claiming lport 580dc508-636a-420e-aed2-8efd9dccace5 for this chassis.
Jan 22 17:19:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:16Z|00099|binding|INFO|580dc508-636a-420e-aed2-8efd9dccace5: Claiming fa:16:3e:01:a3:f5 10.100.0.6
Jan 22 17:19:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:16Z|00100|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 up in Southbound
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.908 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:a3:f5 10.100.0.6'], port_security=['fa:16:3e:01:a3:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=580dc508-636a-420e-aed2-8efd9dccace5) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.911 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d1cd77e9-a82e-4846-a0fd-52c6e68a052f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214383, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.929 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a21d241-2944-48a8-ad5e-e3ada3600955]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391507, 'tstamp': 391507}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214384, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391510, 'tstamp': 391510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214384, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.931 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.932 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:16 np0005592767 nova_compute[182623]: 2026-01-22 22:19:16.937 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.938 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.938 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.938 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.939 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.941 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 580dc508-636a-420e-aed2-8efd9dccace5 in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea bound to our chassis#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.943 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.959 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d5fb22fc-b6ba-4468-b9f5-200f7165b75a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.961 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap698e77c5-f1 in ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.963 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap698e77c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.963 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[63f5c504-a407-494f-b043-fb7d5dc7940b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.964 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d228828a-55cf-445b-a8c1-5b102b0f1085]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.978 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7f844f-37f0-4c91-b025-8acaf6248348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:16.993 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[854d169f-95be-481c-a75e-73a613d280eb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.044 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[72998982-66a6-417b-b2b6-4a77533a070a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.053 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[be58e8d5-cfa5-4452-bd53-32daf042df78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 NetworkManager[54973]: <info>  [1769120357.0551] manager: (tap698e77c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Jan 22 17:19:17 np0005592767 systemd-udevd[214371]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.092 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e276fbc1-14e8-48ec-9b02-69c9c1a1ee71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.096 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ef593e-4280-438d-8cd9-655cd8da6cc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 NetworkManager[54973]: <info>  [1769120357.1158] device (tap698e77c5-f0): carrier: link connected
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.120 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[18f247b9-e23e-468f-86aa-9cf39e700d67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.137 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[31a8c302-d6ec-4757-87bd-4dbf2938516d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395369, 'reachable_time': 34040, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214431, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.154 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ba673097-ef62-4234-8bcd-41ec1b075797]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:3733'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395369, 'tstamp': 395369}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214432, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.170 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[993e33a0-2319-4b06-a08d-f0a4a1f31974]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395369, 'reachable_time': 34040, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214433, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.199 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[839440be-3b03-46bf-bc3a-b4e3750e396b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.255 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1a283905-c152-4614-9846-079ccc5254ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.257 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.257 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.257 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap698e77c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.260 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:17 np0005592767 NetworkManager[54973]: <info>  [1769120357.2608] manager: (tap698e77c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 22 17:19:17 np0005592767 kernel: tap698e77c5-f0: entered promiscuous mode
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.269 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.274 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap698e77c5-f0, col_values=(('external_ids', {'iface-id': 'a18a2be2-f1a5-44ce-96ac-2c546dab3eef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:17Z|00101|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.275 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.289 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.294 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.295 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.296 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5649299c-f4a9-4e27-a38d-6ad7196e38f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.297 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:19:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:17.298 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'env', 'PROCESS_TAG=haproxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/698e77c5-fce6-47a5-b6e3-f4c56da226ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.460 182627 INFO nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Post operation of migration started#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.533 182627 INFO nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance shutdown successfully after 3 seconds.#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.542 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance destroyed successfully.#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.546 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance destroyed successfully.#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.547 182627 DEBUG nova.virt.libvirt.vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=<?>,
task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:13Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.547 182627 DEBUG nova.network.os_vif_util [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.548 182627 DEBUG nova.network.os_vif_util [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.548 182627 DEBUG os_vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.550 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.551 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap648c69ef-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.552 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.554 182627 DEBUG nova.compute.manager [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.554 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.558 182627 INFO os_vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b')
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.558 182627 INFO nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deleting instance files /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4_del
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.559 182627 INFO nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deletion of /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4_del complete
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.693 182627 DEBUG oslo_concurrency.lockutils [None req-13b2f2ed-6bd4-4d7e-aaa1-35c5d44edec3 8b4b664c607b420099bf68323b0ada94 fe4a1554e267466997fcad4d5b6b270a - - default default] Lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 11.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:17 np0005592767 podman[214469]: 2026-01-22 22:19:17.717303717 +0000 UTC m=+0.055361912 container create 491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:19:17 np0005592767 podman[214469]: 2026-01-22 22:19:17.69043562 +0000 UTC m=+0.028493835 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.815 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.815 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.815 182627 DEBUG nova.network.neutron [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:19:17 np0005592767 systemd[1]: Started libpod-conmon-491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce.scope.
Jan 22 17:19:17 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:19:17 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d141024a03b6268a5c41a7ee2300f779c21fcaab2d181ea6e04b06ef461296/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:19:17 np0005592767 podman[214469]: 2026-01-22 22:19:17.887072186 +0000 UTC m=+0.225130401 container init 491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:19:17 np0005592767 podman[214469]: 2026-01-22 22:19:17.895131049 +0000 UTC m=+0.233189244 container start 491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:19:17 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [NOTICE]   (214489) : New worker (214491) forked
Jan 22 17:19:17 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [NOTICE]   (214489) : Loading success.
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.968 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.969 182627 INFO nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating image(s)
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.969 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.969 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.970 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.970 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:17 np0005592767 nova_compute[182623]: 2026-01-22 22:19:17.971 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.458 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.539 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.part --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.541 182627 DEBUG nova.virt.images [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] 8bcaf91e-26cd-4687-9abd-8185bd0c5241 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.543 182627 DEBUG nova.privsep.utils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.543 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.part /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.560 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.726 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.part /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.converted" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.731 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.794 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c.converted --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.795 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.812 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.861 182627 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.861 182627 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.862 182627 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.862 182627 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.862 182627 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.863 182627 WARNING nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state error and task_state rebuild_spawning.
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.863 182627 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.863 182627 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.863 182627 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.864 182627 DEBUG oslo_concurrency.lockutils [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.864 182627 DEBUG nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.864 182627 WARNING nova.compute.manager [req-29470626-f91c-471c-b624-3d0a94717cfa req-424ffce6-408a-428a-a14f-6f0a340c0bd5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state error and task_state rebuild_spawning.
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.869 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.870 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.870 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.885 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.948 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.949 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.984 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.985 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:19 np0005592767 nova_compute[182623]: 2026-01-22 22:19:19.985 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.003 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Acquiring lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.004 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.004 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Acquiring lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.005 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.005 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.015 182627 INFO nova.compute.manager [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Terminating instance
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.026 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Acquiring lock "refresh_cache-2fa0a577-f149-488d-8c47-6dfa4ca56c67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.026 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Acquired lock "refresh_cache-2fa0a577-f149-488d-8c47-6dfa4ca56c67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.026 182627 DEBUG nova.network.neutron [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.041 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.041 182627 DEBUG nova.virt.disk.api [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Checking if we can resize image /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.042 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.111 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.112 182627 DEBUG nova.virt.disk.api [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Cannot resize image /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.112 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.112 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Ensure instance console log exists: /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.113 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.113 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.113 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.116 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start _get_guest_xml network_info=[{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.121 182627 WARNING nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.127 182627 DEBUG nova.virt.libvirt.host [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.128 182627 DEBUG nova.virt.libvirt.host [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.131 182627 DEBUG nova.virt.libvirt.host [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.131 182627 DEBUG nova.virt.libvirt.host [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.132 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.133 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.133 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.133 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.133 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.134 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.134 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.134 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.134 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.134 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.134 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.135 182627 DEBUG nova.virt.hardware [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.135 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'vcpu_model' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.152 182627 DEBUG nova.virt.libvirt.vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdmin
TestJSON-1825362070-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:17Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.153 182627 DEBUG nova.network.os_vif_util [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.153 182627 DEBUG nova.network.os_vif_util [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.155 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <uuid>e6db2ef0-a660-4d03-8a2d-9574e7af17d4</uuid>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <name>instance-00000012</name>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersAdminTestJSON-server-1481617455</nova:name>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:19:20</nova:creationTime>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:user uuid="f23ea0c335b84bd2b78725d5a5491d0a">tempest-ServersAdminTestJSON-1825362070-project-member</nova:user>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:project uuid="214876cdc63543458d35ee214fe21b2c">tempest-ServersAdminTestJSON-1825362070</nova:project>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        <nova:port uuid="648c69ef-5bab-43c9-99a7-4b49b3122d56">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <entry name="serial">e6db2ef0-a660-4d03-8a2d-9574e7af17d4</entry>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <entry name="uuid">e6db2ef0-a660-4d03-8a2d-9574e7af17d4</entry>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:ea:a6:7a"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <target dev="tap648c69ef-5b"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/console.log" append="off"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:19:20 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:19:20 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:19:20 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:19:20 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.155 182627 DEBUG nova.virt.libvirt.vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdmin
TestJSON-1825362070-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:17Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.156 182627 DEBUG nova.network.os_vif_util [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.156 182627 DEBUG nova.network.os_vif_util [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.156 182627 DEBUG os_vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.157 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.157 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.158 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.160 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.160 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap648c69ef-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.160 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap648c69ef-5b, col_values=(('external_ids', {'iface-id': '648c69ef-5bab-43c9-99a7-4b49b3122d56', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:a6:7a', 'vm-uuid': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:20 np0005592767 NetworkManager[54973]: <info>  [1769120360.1631] manager: (tap648c69ef-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.165 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.169 182627 INFO os_vif [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b')#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.233 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.233 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.234 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No VIF found with MAC fa:16:3e:ea:a6:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.234 182627 INFO nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Using config drive#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.247 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'ec2_ids' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.277 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'keypairs' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.701 182627 DEBUG nova.network.neutron [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.888 182627 INFO nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating config drive at /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config#033[00m
Jan 22 17:19:20 np0005592767 nova_compute[182623]: 2026-01-22 22:19:20.894 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcn3x7s9p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.014 182627 DEBUG nova.network.neutron [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.024 182627 DEBUG oslo_concurrency.processutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcn3x7s9p" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.031 182627 DEBUG nova.network.neutron [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.045 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Releasing lock "refresh_cache-2fa0a577-f149-488d-8c47-6dfa4ca56c67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.046 182627 DEBUG nova.compute.manager [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.066 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:19:21 np0005592767 kernel: tap648c69ef-5b: entered promiscuous mode
Jan 22 17:19:21 np0005592767 NetworkManager[54973]: <info>  [1769120361.0923] manager: (tap648c69ef-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.095 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.096 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.096 182627 DEBUG oslo_concurrency.lockutils [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:21Z|00102|binding|INFO|Claiming lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 for this chassis.
Jan 22 17:19:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:21Z|00103|binding|INFO|648c69ef-5bab-43c9-99a7-4b49b3122d56: Claiming fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.097 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:21 np0005592767 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 22 17:19:21 np0005592767 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Consumed 4.924s CPU time.
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.103 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:a6:7a 10.100.0.8'], port_security=['fa:16:3e:ea:a6:7a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=648c69ef-5bab-43c9-99a7-4b49b3122d56) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.104 182627 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.104 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 648c69ef-5bab-43c9-99a7-4b49b3122d56 in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c bound to our chassis#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.106 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:19:21 np0005592767 systemd-machined[153912]: Machine qemu-12-instance-00000011 terminated.
Jan 22 17:19:21 np0005592767 virtqemud[182095]: Domain id=11 name='instance-00000016' uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed is tainted: custom-monitor
Jan 22 17:19:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:21Z|00104|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 ovn-installed in OVS
Jan 22 17:19:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:21Z|00105|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 up in Southbound
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:21 np0005592767 systemd-udevd[214548]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.125 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3efa33-a568-4219-8949-8f4a6f494e2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:21 np0005592767 NetworkManager[54973]: <info>  [1769120361.1339] device (tap648c69ef-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:19:21 np0005592767 NetworkManager[54973]: <info>  [1769120361.1348] device (tap648c69ef-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:19:21 np0005592767 systemd-machined[153912]: New machine qemu-13-instance-00000012.
Jan 22 17:19:21 np0005592767 systemd[1]: Started Virtual Machine qemu-13-instance-00000012.
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.156 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e00bc6de-f719-4e91-a6ab-6a36800da937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.160 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[42b10af5-39f8-4b1c-804e-474869c3693d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.188 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5f477d28-f9c4-4fac-8ff8-a87d8877ed5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.207 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c9fdaeb8-fd09-48a0-8dfe-4b8d3ffce116]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214572, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.226 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3cda8a-6287-4a67-a98a-92468d616a0c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391507, 'tstamp': 391507}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214577, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391510, 'tstamp': 391510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214577, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.228 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.230 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.231 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.231 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.231 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.232 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:21.232 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:21 np0005592767 podman[214555]: 2026-01-22 22:19:21.240351083 +0000 UTC m=+0.063189958 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.300 182627 INFO nova.virt.libvirt.driver [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Instance destroyed successfully.#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.300 182627 DEBUG nova.objects.instance [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lazy-loading 'resources' on Instance uuid 2fa0a577-f149-488d-8c47-6dfa4ca56c67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.319 182627 INFO nova.virt.libvirt.driver [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Deleting instance files /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67_del#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.324 182627 INFO nova.virt.libvirt.driver [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Deletion of /var/lib/nova/instances/2fa0a577-f149-488d-8c47-6dfa4ca56c67_del complete#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.408 182627 INFO nova.compute.manager [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.409 182627 DEBUG oslo.service.loopingcall [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.409 182627 DEBUG nova.compute.manager [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.409 182627 DEBUG nova.network.neutron [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.698 182627 DEBUG nova.network.neutron [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.728 182627 DEBUG nova.network.neutron [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.737 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for e6db2ef0-a660-4d03-8a2d-9574e7af17d4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.738 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120361.737577, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.738 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.740 182627 DEBUG nova.compute.manager [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.740 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.744 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance spawned successfully.#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.744 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.767 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.770 182627 INFO nova.compute.manager [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Took 0.36 seconds to deallocate network for instance.#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.774 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.780 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.780 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.781 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.781 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.782 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.782 182627 DEBUG nova.virt.libvirt.driver [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.827 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.828 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120361.7394085, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.828 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Started (Lifecycle Event)#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.863 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.867 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.926 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.928 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.928 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:21 np0005592767 nova_compute[182623]: 2026-01-22 22:19:21.933 182627 DEBUG nova.compute.manager [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.055 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.088 182627 DEBUG nova.compute.manager [req-87459a7f-0dc8-48d4-84cf-0585d68fd502 req-1152e1bc-d467-47c6-81b0-fb931c48273d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.089 182627 DEBUG oslo_concurrency.lockutils [req-87459a7f-0dc8-48d4-84cf-0585d68fd502 req-1152e1bc-d467-47c6-81b0-fb931c48273d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.089 182627 DEBUG oslo_concurrency.lockutils [req-87459a7f-0dc8-48d4-84cf-0585d68fd502 req-1152e1bc-d467-47c6-81b0-fb931c48273d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.089 182627 DEBUG oslo_concurrency.lockutils [req-87459a7f-0dc8-48d4-84cf-0585d68fd502 req-1152e1bc-d467-47c6-81b0-fb931c48273d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.090 182627 DEBUG nova.compute.manager [req-87459a7f-0dc8-48d4-84cf-0585d68fd502 req-1152e1bc-d467-47c6-81b0-fb931c48273d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.090 182627 WARNING nova.compute.manager [req-87459a7f-0dc8-48d4-84cf-0585d68fd502 req-1152e1bc-d467-47c6-81b0-fb931c48273d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.099 182627 DEBUG nova.compute.provider_tree [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.115 182627 DEBUG nova.scheduler.client.report [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.123 182627 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.149 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.152 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.152 182627 DEBUG nova.objects.instance [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.189 182627 INFO nova.scheduler.client.report [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Deleted allocations for instance 2fa0a577-f149-488d-8c47-6dfa4ca56c67#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.311 182627 DEBUG oslo_concurrency.lockutils [None req-a7489ea0-f6a2-46eb-99cb-207282cb9f16 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:22 np0005592767 nova_compute[182623]: 2026-01-22 22:19:22.348 182627 DEBUG oslo_concurrency.lockutils [None req-f562e35b-3da2-43a8-8bb9-6d93de6b52b0 00c21612db1c4c16a7cf8957fc8ca445 308077846c7f447188b9270a5844b0b1 - - default default] Lock "2fa0a577-f149-488d-8c47-6dfa4ca56c67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:23 np0005592767 nova_compute[182623]: 2026-01-22 22:19:23.130 182627 INFO nova.virt.libvirt.driver [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 22 17:19:23 np0005592767 nova_compute[182623]: 2026-01-22 22:19:23.137 182627 DEBUG nova.compute.manager [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:23 np0005592767 nova_compute[182623]: 2026-01-22 22:19:23.233 182627 DEBUG nova.objects.instance [None req-95ff16cd-6c97-4b1e-bafb-06c89b69274a ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.221 182627 DEBUG nova.compute.manager [req-882d143a-c248-47cd-89ba-9abb9a91ddeb req-e62a7794-e62a-4f98-aab5-f4924fcdfa61 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.223 182627 DEBUG oslo_concurrency.lockutils [req-882d143a-c248-47cd-89ba-9abb9a91ddeb req-e62a7794-e62a-4f98-aab5-f4924fcdfa61 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.223 182627 DEBUG oslo_concurrency.lockutils [req-882d143a-c248-47cd-89ba-9abb9a91ddeb req-e62a7794-e62a-4f98-aab5-f4924fcdfa61 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.223 182627 DEBUG oslo_concurrency.lockutils [req-882d143a-c248-47cd-89ba-9abb9a91ddeb req-e62a7794-e62a-4f98-aab5-f4924fcdfa61 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.223 182627 DEBUG nova.compute.manager [req-882d143a-c248-47cd-89ba-9abb9a91ddeb req-e62a7794-e62a-4f98-aab5-f4924fcdfa61 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.224 182627 WARNING nova.compute.manager [req-882d143a-c248-47cd-89ba-9abb9a91ddeb req-e62a7794-e62a-4f98-aab5-f4924fcdfa61 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:19:24 np0005592767 nova_compute[182623]: 2026-01-22 22:19:24.556 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:25 np0005592767 nova_compute[182623]: 2026-01-22 22:19:25.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:25 np0005592767 podman[214598]: 2026-01-22 22:19:25.181364895 +0000 UTC m=+0.087002186 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container)
Jan 22 17:19:25 np0005592767 podman[214597]: 2026-01-22 22:19:25.207057358 +0000 UTC m=+0.111928357 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.059 182627 INFO nova.compute.manager [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Rebuilding instance#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.403 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "d3c0bb3e-4a49-475d-aa25-b992d112af28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.403 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "d3c0bb3e-4a49-475d-aa25-b992d112af28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.416 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.428 182627 DEBUG nova.compute.manager [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.564 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_requests' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.571 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.572 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.579 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.579 182627 INFO nova.compute.claims [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.582 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'pci_devices' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.612 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'resources' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.639 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'migration_context' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.666 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.674 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.758 182627 DEBUG nova.compute.provider_tree [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.775 182627 DEBUG nova.scheduler.client.report [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.805 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.867 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "37c44a11-c0a4-4e9d-b692-bf2463a4d6cf" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.868 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "37c44a11-c0a4-4e9d-b692-bf2463a4d6cf" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.894 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] No node specified, defaulting to compute-2.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.941 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "37c44a11-c0a4-4e9d-b692-bf2463a4d6cf" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:26 np0005592767 nova_compute[182623]: 2026-01-22 22:19:26.942 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.022 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.022 182627 DEBUG nova.network.neutron [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.042 182627 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.062 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.222 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.223 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.224 182627 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Creating image(s)#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.224 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "/var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.225 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.225 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "/var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.242 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.322 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.323 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.324 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.335 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.389 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.390 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.424 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.425 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.426 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.483 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.484 182627 DEBUG nova.virt.disk.api [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Checking if we can resize image /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.485 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.543 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.544 182627 DEBUG nova.virt.disk.api [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Cannot resize image /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.545 182627 DEBUG nova.objects.instance [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'migration_context' on Instance uuid d3c0bb3e-4a49-475d-aa25-b992d112af28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.572 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.572 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Ensure instance console log exists: /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.573 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.573 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.574 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.585 182627 DEBUG nova.network.neutron [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.585 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.587 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.593 182627 WARNING nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.600 182627 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.601 182627 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.604 182627 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.604 182627 DEBUG nova.virt.libvirt.host [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.606 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.606 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.607 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.607 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.607 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.607 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.608 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.608 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.608 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.608 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.608 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.609 182627 DEBUG nova.virt.hardware [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.613 182627 DEBUG nova.objects.instance [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'pci_devices' on Instance uuid d3c0bb3e-4a49-475d-aa25-b992d112af28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.634 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <uuid>d3c0bb3e-4a49-475d-aa25-b992d112af28</uuid>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <name>instance-0000001c</name>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersOnMultiNodesTest-server-975595322-2</nova:name>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:19:27</nova:creationTime>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:user uuid="7c7976c0d8814435b29d032d44312d82">tempest-ServersOnMultiNodesTest-1342288026-project-member</nova:user>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:        <nova:project uuid="bb26d5e006aa4c1a8f553f412a76778a">tempest-ServersOnMultiNodesTest-1342288026</nova:project>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <entry name="serial">d3c0bb3e-4a49-475d-aa25-b992d112af28</entry>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <entry name="uuid">d3c0bb3e-4a49-475d-aa25-b992d112af28</entry>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.config"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/console.log" append="off"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:19:27 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:19:27 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:19:27 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:19:27 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.706 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.707 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.707 182627 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Using config drive#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.727 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Check if temp file /var/lib/nova/instances/tmpyr59vjpp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.742 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.833 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.835 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.898 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:27 np0005592767 nova_compute[182623]: 2026-01-22 22:19:27.901 182627 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.229 182627 INFO nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Creating config drive at /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.config#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.235 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gufl723 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.358 182627 DEBUG oslo_concurrency.processutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gufl723" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:28 np0005592767 systemd-machined[153912]: New machine qemu-14-instance-0000001c.
Jan 22 17:19:28 np0005592767 systemd[1]: Started Virtual Machine qemu-14-instance-0000001c.
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.673 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.674 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.676 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120368.6759508, d3c0bb3e-4a49-475d-aa25-b992d112af28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.677 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.684 182627 INFO nova.virt.libvirt.driver [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Instance spawned successfully.#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.684 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.713 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.759 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.761 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.761 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.762 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.762 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.762 182627 DEBUG nova.virt.libvirt.driver [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.766 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.806 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.807 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120368.6760907, d3c0bb3e-4a49-475d-aa25-b992d112af28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.807 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] VM Started (Lifecycle Event)#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.833 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.839 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.859 182627 INFO nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Took 1.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.860 182627 DEBUG nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.867 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.956 182627 INFO nova.compute.manager [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Took 2.42 seconds to build instance.#033[00m
Jan 22 17:19:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:28.971 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:28 np0005592767 nova_compute[182623]: 2026-01-22 22:19:28.972 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:28.973 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:19:29 np0005592767 nova_compute[182623]: 2026-01-22 22:19:29.013 182627 DEBUG oslo_concurrency.lockutils [None req-166bf966-15a5-4121-9ce9-0b2ba7591f97 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "d3c0bb3e-4a49-475d-aa25-b992d112af28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:29 np0005592767 nova_compute[182623]: 2026-01-22 22:19:29.203 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:29 np0005592767 nova_compute[182623]: 2026-01-22 22:19:29.256 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:29 np0005592767 nova_compute[182623]: 2026-01-22 22:19:29.257 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:29 np0005592767 nova_compute[182623]: 2026-01-22 22:19:29.317 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:29 np0005592767 nova_compute[182623]: 2026-01-22 22:19:29.558 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:30 np0005592767 nova_compute[182623]: 2026-01-22 22:19:30.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:31 np0005592767 podman[214697]: 2026-01-22 22:19:31.159956189 +0000 UTC m=+0.075516115 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 17:19:32 np0005592767 systemd-logind[802]: New session 29 of user nova.
Jan 22 17:19:32 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:19:32 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:19:32 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:19:32 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:19:32 np0005592767 systemd[214722]: Queued start job for default target Main User Target.
Jan 22 17:19:32 np0005592767 systemd[214722]: Created slice User Application Slice.
Jan 22 17:19:32 np0005592767 systemd[214722]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:19:32 np0005592767 systemd[214722]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:19:32 np0005592767 systemd[214722]: Reached target Paths.
Jan 22 17:19:32 np0005592767 systemd[214722]: Reached target Timers.
Jan 22 17:19:32 np0005592767 systemd[214722]: Starting D-Bus User Message Bus Socket...
Jan 22 17:19:32 np0005592767 systemd[214722]: Starting Create User's Volatile Files and Directories...
Jan 22 17:19:32 np0005592767 systemd[214722]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:19:32 np0005592767 systemd[214722]: Reached target Sockets.
Jan 22 17:19:32 np0005592767 systemd[214722]: Finished Create User's Volatile Files and Directories.
Jan 22 17:19:32 np0005592767 systemd[214722]: Reached target Basic System.
Jan 22 17:19:32 np0005592767 systemd[214722]: Reached target Main User Target.
Jan 22 17:19:32 np0005592767 systemd[214722]: Startup finished in 147ms.
Jan 22 17:19:32 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:19:32 np0005592767 systemd[1]: Started Session 29 of User nova.
Jan 22 17:19:32 np0005592767 podman[214737]: 2026-01-22 22:19:32.330015272 +0000 UTC m=+0.052736735 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:19:32 np0005592767 systemd[1]: session-29.scope: Deactivated successfully.
Jan 22 17:19:32 np0005592767 systemd-logind[802]: Session 29 logged out. Waiting for processes to exit.
Jan 22 17:19:32 np0005592767 systemd-logind[802]: Removed session 29.
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.340 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "d3c0bb3e-4a49-475d-aa25-b992d112af28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.340 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "d3c0bb3e-4a49-475d-aa25-b992d112af28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.340 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "d3c0bb3e-4a49-475d-aa25-b992d112af28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.341 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "d3c0bb3e-4a49-475d-aa25-b992d112af28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.341 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "d3c0bb3e-4a49-475d-aa25-b992d112af28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.353 182627 INFO nova.compute.manager [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Terminating instance#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.364 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "refresh_cache-d3c0bb3e-4a49-475d-aa25-b992d112af28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.365 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquired lock "refresh_cache-d3c0bb3e-4a49-475d-aa25-b992d112af28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.365 182627 DEBUG nova.network.neutron [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.378 182627 DEBUG nova.compute.manager [req-cdb9a8e2-faa1-4a8d-bff3-caaa8766d5b4 req-947a91c8-27da-4b9b-91e2-4b97ae2fcef2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.378 182627 DEBUG oslo_concurrency.lockutils [req-cdb9a8e2-faa1-4a8d-bff3-caaa8766d5b4 req-947a91c8-27da-4b9b-91e2-4b97ae2fcef2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.379 182627 DEBUG oslo_concurrency.lockutils [req-cdb9a8e2-faa1-4a8d-bff3-caaa8766d5b4 req-947a91c8-27da-4b9b-91e2-4b97ae2fcef2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.379 182627 DEBUG oslo_concurrency.lockutils [req-cdb9a8e2-faa1-4a8d-bff3-caaa8766d5b4 req-947a91c8-27da-4b9b-91e2-4b97ae2fcef2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.379 182627 DEBUG nova.compute.manager [req-cdb9a8e2-faa1-4a8d-bff3-caaa8766d5b4 req-947a91c8-27da-4b9b-91e2-4b97ae2fcef2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.380 182627 DEBUG nova.compute.manager [req-cdb9a8e2-faa1-4a8d-bff3-caaa8766d5b4 req-947a91c8-27da-4b9b-91e2-4b97ae2fcef2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.701 182627 DEBUG nova.network.neutron [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:19:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:33Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:19:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:33Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.961 182627 INFO nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Took 4.64 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.962 182627 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:19:33 np0005592767 nova_compute[182623]: 2026-01-22 22:19:33.988 182627 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr59vjpp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='469eaf2b-7d53-40c9-a233-b27d702a21ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(4a17ec75-f718-4d4a-b545-0a0698891226),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.020 182627 DEBUG nova.objects.instance [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'migration_context' on Instance uuid 469eaf2b-7d53-40c9-a233-b27d702a21ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.022 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.024 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.024 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.042 182627 DEBUG nova.virt.libvirt.vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:19:23Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.043 182627 DEBUG nova.network.os_vif_util [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.045 182627 DEBUG nova.network.os_vif_util [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.046 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 17:19:34 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:01:a3:f5"/>
Jan 22 17:19:34 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:19:34 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:19:34 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:19:34 np0005592767 nova_compute[182623]:  <target dev="tap580dc508-63"/>
Jan 22 17:19:34 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:19:34 np0005592767 nova_compute[182623]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.046 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.073 182627 DEBUG nova.network.neutron [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.086 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Releasing lock "refresh_cache-d3c0bb3e-4a49-475d-aa25-b992d112af28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.087 182627 DEBUG nova.compute.manager [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:19:34 np0005592767 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 22 17:19:34 np0005592767 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Consumed 5.693s CPU time.
Jan 22 17:19:34 np0005592767 systemd-machined[153912]: Machine qemu-14-instance-0000001c terminated.
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.336 182627 INFO nova.virt.libvirt.driver [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Instance destroyed successfully.#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.337 182627 DEBUG nova.objects.instance [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lazy-loading 'resources' on Instance uuid d3c0bb3e-4a49-475d-aa25-b992d112af28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.359 182627 INFO nova.virt.libvirt.driver [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Deleting instance files /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28_del#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.360 182627 INFO nova.virt.libvirt.driver [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Deletion of /var/lib/nova/instances/d3c0bb3e-4a49-475d-aa25-b992d112af28_del complete#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.456 182627 INFO nova.compute.manager [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.457 182627 DEBUG oslo.service.loopingcall [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.457 182627 DEBUG nova.compute.manager [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.457 182627 DEBUG nova.network.neutron [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.527 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.528 182627 INFO nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.580 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.678 182627 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.679 182627 DEBUG nova.network.neutron [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.695 182627 DEBUG nova.network.neutron [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.710 182627 INFO nova.compute.manager [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Took 0.25 seconds to deallocate network for instance.#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.827 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.828 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.971 182627 DEBUG nova.compute.provider_tree [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:19:34 np0005592767 nova_compute[182623]: 2026-01-22 22:19:34.986 182627 DEBUG nova.scheduler.client.report [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.014 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.042 182627 INFO nova.scheduler.client.report [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Deleted allocations for instance d3c0bb3e-4a49-475d-aa25-b992d112af28
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.137 182627 DEBUG oslo_concurrency.lockutils [None req-c1dd77aa-57f8-4c78-90e0-cafdd1d28c68 7c7976c0d8814435b29d032d44312d82 bb26d5e006aa4c1a8f553f412a76778a - - default default] Lock "d3c0bb3e-4a49-475d-aa25-b992d112af28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.167 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.182 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.182 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.509 182627 DEBUG nova.compute.manager [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.510 182627 DEBUG oslo_concurrency.lockutils [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.510 182627 DEBUG oslo_concurrency.lockutils [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.511 182627 DEBUG oslo_concurrency.lockutils [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.511 182627 DEBUG nova.compute.manager [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.511 182627 WARNING nova.compute.manager [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.511 182627 DEBUG nova.compute.manager [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-changed-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.512 182627 DEBUG nova.compute.manager [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Refreshing instance network info cache due to event network-changed-580dc508-636a-420e-aed2-8efd9dccace5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.512 182627 DEBUG oslo_concurrency.lockutils [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.512 182627 DEBUG oslo_concurrency.lockutils [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.513 182627 DEBUG nova.network.neutron [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Refreshing network info cache for port 580dc508-636a-420e-aed2-8efd9dccace5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.685 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 17:19:35 np0005592767 nova_compute[182623]: 2026-01-22 22:19:35.686 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 17:19:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:35.975 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.189 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.190 182627 DEBUG nova.virt.libvirt.migration [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.297 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120361.2969913, 2fa0a577-f149-488d-8c47-6dfa4ca56c67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.298 182627 INFO nova.compute.manager [-] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] VM Stopped (Lifecycle Event)
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.333 182627 DEBUG nova.compute.manager [None req-e93818cb-9fcc-4953-8f50-f0dbf341017f - - - - - -] [instance: 2fa0a577-f149-488d-8c47-6dfa4ca56c67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.506 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120376.5064523, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.507 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Paused (Lifecycle Event)
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.527 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.532 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.556 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 22 17:19:36 np0005592767 kernel: tap580dc508-63 (unregistering): left promiscuous mode
Jan 22 17:19:36 np0005592767 NetworkManager[54973]: <info>  [1769120376.6751] device (tap580dc508-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:19:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:36Z|00106|binding|INFO|Releasing lport 580dc508-636a-420e-aed2-8efd9dccace5 from this chassis (sb_readonly=0)
Jan 22 17:19:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:36Z|00107|binding|INFO|Setting lport 580dc508-636a-420e-aed2-8efd9dccace5 down in Southbound
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:36Z|00108|binding|INFO|Removing iface tap580dc508-63 ovn-installed in OVS
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.685 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.692 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:a3:f5 10.100.0.6'], port_security=['fa:16:3e:01:a3:f5 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '67ef43d7-2b40-4e5a-99d1-f4d5d213f4d6'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '469eaf2b-7d53-40c9-a233-b27d702a21ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '18', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=580dc508-636a-420e-aed2-8efd9dccace5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.693 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 580dc508-636a-420e-aed2-8efd9dccace5 in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea unbound from our chassis
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.695 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.697 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.698 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3faef859-a70f-4f21-9ded-8f386dfc1b44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.699 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace which is not needed anymore
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.720 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 17:19:36 np0005592767 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 22 17:19:36 np0005592767 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Consumed 4.900s CPU time.
Jan 22 17:19:36 np0005592767 systemd-machined[153912]: Machine qemu-11-instance-00000016 terminated.
Jan 22 17:19:36 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [NOTICE]   (214489) : haproxy version is 2.8.14-c23fe91
Jan 22 17:19:36 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [NOTICE]   (214489) : path to executable is /usr/sbin/haproxy
Jan 22 17:19:36 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [WARNING]  (214489) : Exiting Master process...
Jan 22 17:19:36 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [ALERT]    (214489) : Current worker (214491) exited with code 143 (Terminated)
Jan 22 17:19:36 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[214485]: [WARNING]  (214489) : All workers exited. Exiting... (0)
Jan 22 17:19:36 np0005592767 systemd[1]: libpod-491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce.scope: Deactivated successfully.
Jan 22 17:19:36 np0005592767 podman[214814]: 2026-01-22 22:19:36.858091561 +0000 UTC m=+0.056675939 container died 491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:19:36 np0005592767 systemd[1]: var-lib-containers-storage-overlay-f3d141024a03b6268a5c41a7ee2300f779c21fcaab2d181ea6e04b06ef461296-merged.mount: Deactivated successfully.
Jan 22 17:19:36 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce-userdata-shm.mount: Deactivated successfully.
Jan 22 17:19:36 np0005592767 podman[214814]: 2026-01-22 22:19:36.912277058 +0000 UTC m=+0.110861436 container cleanup 491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.920 182627 DEBUG nova.virt.libvirt.guest [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.922 182627 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration operation has completed
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.922 182627 INFO nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] _post_live_migration() is started..
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.924 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.925 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 22 17:19:36 np0005592767 nova_compute[182623]: 2026-01-22 22:19:36.925 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 22 17:19:36 np0005592767 systemd[1]: libpod-conmon-491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce.scope: Deactivated successfully.
Jan 22 17:19:36 np0005592767 podman[214859]: 2026-01-22 22:19:36.98114452 +0000 UTC m=+0.046316031 container remove 491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.985 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e57a9725-730a-4e12-8602-3cc2870a61b2]: (4, ('Thu Jan 22 10:19:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce)\n491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce\nThu Jan 22 10:19:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce)\n491e8aa5ac3010b2c0b0ccc1f67bacdc6d1dae250f4ae8df4e8008fae6feabce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.988 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0060e8bb-0fd2-4e27-998b-d92852d09032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:36.989 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.024 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:37 np0005592767 kernel: tap698e77c5-f0: left promiscuous mode
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:37.044 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e4dfb22a-018f-4c26-a90e-bb75ec2cf2a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:37.058 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[36b02de5-dcdb-4b46-b5a7-c96ee3dc149e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:37.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b058f87e-833d-4d62-9d49-284833038f8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:37.079 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f691d9f2-e3cd-4d7f-9495-2e58436c19c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395361, 'reachable_time': 33362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214881, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:37.082 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:19:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:37.082 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[0b60e92e-0a01-468f-a032-7bafc1376aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:37 np0005592767 systemd[1]: run-netns-ovnmeta\x2d698e77c5\x2dfce6\x2d47a5\x2db6e3\x2df4c56da226ea.mount: Deactivated successfully.
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.648 182627 DEBUG nova.compute.manager [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.648 182627 DEBUG oslo_concurrency.lockutils [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.649 182627 DEBUG oslo_concurrency.lockutils [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.649 182627 DEBUG oslo_concurrency.lockutils [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.650 182627 DEBUG nova.compute.manager [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.650 182627 DEBUG nova.compute.manager [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.651 182627 DEBUG nova.compute.manager [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.651 182627 DEBUG oslo_concurrency.lockutils [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.652 182627 DEBUG oslo_concurrency.lockutils [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.652 182627 DEBUG oslo_concurrency.lockutils [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.653 182627 DEBUG nova.compute.manager [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.653 182627 WARNING nova.compute.manager [req-1aac09ef-7c71-4086-b2ec-11834a3c1c3b req-045b72a3-51f1-4d0e-8b3e-fc6e18be2e9b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.979 182627 DEBUG nova.network.neutron [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updated VIF entry in instance network info cache for port 580dc508-636a-420e-aed2-8efd9dccace5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:19:37 np0005592767 nova_compute[182623]: 2026-01-22 22:19:37.980 182627 DEBUG nova.network.neutron [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Updating instance_info_cache with network_info: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.007 182627 DEBUG nova.compute.manager [req-16e7fda1-0234-41ac-8946-299c72f830d2 req-83b71711-3b20-4256-8097-4edfb11bc7b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.008 182627 DEBUG oslo_concurrency.lockutils [req-16e7fda1-0234-41ac-8946-299c72f830d2 req-83b71711-3b20-4256-8097-4edfb11bc7b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.008 182627 DEBUG oslo_concurrency.lockutils [req-16e7fda1-0234-41ac-8946-299c72f830d2 req-83b71711-3b20-4256-8097-4edfb11bc7b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.008 182627 DEBUG oslo_concurrency.lockutils [req-16e7fda1-0234-41ac-8946-299c72f830d2 req-83b71711-3b20-4256-8097-4edfb11bc7b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.008 182627 DEBUG nova.compute.manager [req-16e7fda1-0234-41ac-8946-299c72f830d2 req-83b71711-3b20-4256-8097-4edfb11bc7b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.009 182627 DEBUG nova.compute.manager [req-16e7fda1-0234-41ac-8946-299c72f830d2 req-83b71711-3b20-4256-8097-4edfb11bc7b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-unplugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.010 182627 DEBUG oslo_concurrency.lockutils [req-f1be03c0-ee36-4702-bb1c-cdca9f180814 req-097b55c7-f3d8-4efc-8204-f07ddab8bc68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-469eaf2b-7d53-40c9-a233-b27d702a21ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.103 182627 DEBUG nova.network.neutron [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Activated binding for port 580dc508-636a-420e-aed2-8efd9dccace5 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.103 182627 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.104 182627 DEBUG nova.virt.libvirt.vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-55126447',display_name='tempest-LiveMigrationTest-server-55126447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-55126447',id=22,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-i0lnrhmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:19:27Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=469eaf2b-7d53-40c9-a233-b27d702a21ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.104 182627 DEBUG nova.network.os_vif_util [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "580dc508-636a-420e-aed2-8efd9dccace5", "address": "fa:16:3e:01:a3:f5", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap580dc508-63", "ovs_interfaceid": "580dc508-636a-420e-aed2-8efd9dccace5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.105 182627 DEBUG nova.network.os_vif_util [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.105 182627 DEBUG os_vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.108 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap580dc508-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.120 182627 INFO os_vif [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:a3:f5,bridge_name='br-int',has_traffic_filtering=True,id=580dc508-636a-420e-aed2-8efd9dccace5,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap580dc508-63')#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.120 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.121 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.121 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.121 182627 DEBUG nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.122 182627 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deleting instance files /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed_del#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.122 182627 INFO nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Deletion of /var/lib/nova/instances/469eaf2b-7d53-40c9-a233-b27d702a21ed_del complete#033[00m
Jan 22 17:19:38 np0005592767 kernel: tap648c69ef-5b (unregistering): left promiscuous mode
Jan 22 17:19:38 np0005592767 NetworkManager[54973]: <info>  [1769120378.8754] device (tap648c69ef-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:19:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:38Z|00109|binding|INFO|Releasing lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 from this chassis (sb_readonly=0)
Jan 22 17:19:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:38Z|00110|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 down in Southbound
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.878 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:38Z|00111|binding|INFO|Removing iface tap648c69ef-5b ovn-installed in OVS
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.881 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.885 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:a6:7a 10.100.0.8'], port_security=['fa:16:3e:ea:a6:7a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=648c69ef-5bab-43c9-99a7-4b49b3122d56) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.886 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 648c69ef-5bab-43c9-99a7-4b49b3122d56 in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c unbound from our chassis#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.888 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:19:38 np0005592767 nova_compute[182623]: 2026-01-22 22:19:38.898 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.904 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[80db90c8-61e0-4482-b51a-5ef73045b91d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:38 np0005592767 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 22 17:19:38 np0005592767 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Consumed 12.909s CPU time.
Jan 22 17:19:38 np0005592767 systemd-machined[153912]: Machine qemu-13-instance-00000012 terminated.
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.939 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8c4266-5118-4e85-a0f8-d69aa25f30cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.943 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c66c63e9-b80b-49e9-94fe-8bd28aea6394]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.972 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9065471d-681e-45b4-b626-29a5fbe7c482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:38.991 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d89c3998-ce2a-4723-b906-9b16a31ae4f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214894, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:39.009 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9fae06bf-3762-4ae8-9eff-dad8b79372af]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391507, 'tstamp': 391507}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214895, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391510, 'tstamp': 391510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214895, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:39.011 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.013 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.017 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:39.017 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:39.017 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:39.018 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:39.018 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.580 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.734 182627 INFO nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.741 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance destroyed successfully.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.748 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance destroyed successfully.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.750 182627 DEBUG nova.virt.libvirt.vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-
member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:25Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.751 182627 DEBUG nova.network.os_vif_util [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.752 182627 DEBUG nova.network.os_vif_util [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.753 182627 DEBUG os_vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.758 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.759 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap648c69ef-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.765 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.766 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.766 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.767 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.767 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.768 182627 WARNING nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.768 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.769 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.770 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.770 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.771 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.771 182627 WARNING nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.771 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.772 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.772 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.773 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.774 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] No waiting events found dispatching network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.774 182627 WARNING nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Received unexpected event network-vif-plugged-580dc508-636a-420e-aed2-8efd9dccace5 for instance with vm_state active and task_state migrating.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.775 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.775 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.775 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.776 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.776 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.776 182627 WARNING nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.776 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.777 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.777 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.777 182627 DEBUG oslo_concurrency.lockutils [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.778 182627 DEBUG nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.778 182627 WARNING nova.compute.manager [req-dff73538-5e30-40e4-b7c5-495830b145b9 req-0c704b26-cb79-42e8-8c87-9402106098ae 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.779 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.781 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.783 182627 INFO os_vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b')#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.784 182627 INFO nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deleting instance files /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4_del#033[00m
Jan 22 17:19:39 np0005592767 nova_compute[182623]: 2026-01-22 22:19:39.784 182627 INFO nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deletion of /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4_del complete#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.095 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.095 182627 INFO nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating image(s)#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.096 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.096 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.097 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.109 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:40 np0005592767 podman[214914]: 2026-01-22 22:19:40.136212935 +0000 UTC m=+0.053623052 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.161 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.162 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.162 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.173 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.237 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.238 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.268 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.269 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.270 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.320 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.321 182627 DEBUG nova.virt.disk.api [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Checking if we can resize image /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.322 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.381 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.382 182627 DEBUG nova.virt.disk.api [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Cannot resize image /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.382 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.383 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Ensure instance console log exists: /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.383 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.384 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.384 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.386 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start _get_guest_xml network_info=[{"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.391 182627 WARNING nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.396 182627 DEBUG nova.virt.libvirt.host [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.397 182627 DEBUG nova.virt.libvirt.host [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.400 182627 DEBUG nova.virt.libvirt.host [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.401 182627 DEBUG nova.virt.libvirt.host [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.402 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.403 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.403 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.403 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.403 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.404 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.404 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.404 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.404 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.405 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.405 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.405 182627 DEBUG nova.virt.hardware [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.405 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'vcpu_model' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.426 182627 DEBUG nova.virt.libvirt.vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdmin
TestJSON-1825362070-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:39Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.426 182627 DEBUG nova.network.os_vif_util [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.427 182627 DEBUG nova.network.os_vif_util [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.429 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <uuid>e6db2ef0-a660-4d03-8a2d-9574e7af17d4</uuid>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <name>instance-00000012</name>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersAdminTestJSON-server-1481617455</nova:name>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:19:40</nova:creationTime>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:user uuid="f23ea0c335b84bd2b78725d5a5491d0a">tempest-ServersAdminTestJSON-1825362070-project-member</nova:user>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:project uuid="214876cdc63543458d35ee214fe21b2c">tempest-ServersAdminTestJSON-1825362070</nova:project>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        <nova:port uuid="648c69ef-5bab-43c9-99a7-4b49b3122d56">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <entry name="serial">e6db2ef0-a660-4d03-8a2d-9574e7af17d4</entry>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <entry name="uuid">e6db2ef0-a660-4d03-8a2d-9574e7af17d4</entry>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:ea:a6:7a"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <target dev="tap648c69ef-5b"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/console.log" append="off"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:19:40 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:19:40 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:19:40 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:19:40 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.431 182627 DEBUG nova.virt.libvirt.vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:39Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.431 182627 DEBUG nova.network.os_vif_util [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.432 182627 DEBUG nova.network.os_vif_util [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.432 182627 DEBUG os_vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.433 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.434 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.434 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.437 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap648c69ef-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.437 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap648c69ef-5b, col_values=(('external_ids', {'iface-id': '648c69ef-5bab-43c9-99a7-4b49b3122d56', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:a6:7a', 'vm-uuid': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:40 np0005592767 NetworkManager[54973]: <info>  [1769120380.4395] manager: (tap648c69ef-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.442 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.443 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.445 182627 INFO os_vif [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b')#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.524 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.525 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.525 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] No VIF found with MAC fa:16:3e:ea:a6:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.525 182627 INFO nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Using config drive#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.538 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'ec2_ids' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.566 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'keypairs' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.920 182627 INFO nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Creating config drive at /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config#033[00m
Jan 22 17:19:40 np0005592767 nova_compute[182623]: 2026-01-22 22:19:40.924 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwesr3lae execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.046 182627 DEBUG oslo_concurrency.processutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwesr3lae" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:41 np0005592767 kernel: tap648c69ef-5b: entered promiscuous mode
Jan 22 17:19:41 np0005592767 systemd-udevd[214887]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:19:41 np0005592767 NetworkManager[54973]: <info>  [1769120381.1022] manager: (tap648c69ef-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 22 17:19:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:41Z|00112|binding|INFO|Claiming lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 for this chassis.
Jan 22 17:19:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:41Z|00113|binding|INFO|648c69ef-5bab-43c9-99a7-4b49b3122d56: Claiming fa:16:3e:ea:a6:7a 10.100.0.8
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.111 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:a6:7a 10.100.0.8'], port_security=['fa:16:3e:ea:a6:7a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=648c69ef-5bab-43c9-99a7-4b49b3122d56) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:41 np0005592767 NetworkManager[54973]: <info>  [1769120381.1135] device (tap648c69ef-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:19:41 np0005592767 NetworkManager[54973]: <info>  [1769120381.1141] device (tap648c69ef-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.113 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 648c69ef-5bab-43c9-99a7-4b49b3122d56 in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c bound to our chassis#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.115 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:19:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:41Z|00114|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 ovn-installed in OVS
Jan 22 17:19:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:41Z|00115|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 up in Southbound
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.121 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.130 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1061412a-8cf1-49b2-9e9c-c7c66839f63f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:41 np0005592767 systemd-machined[153912]: New machine qemu-15-instance-00000012.
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.158 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6270298a-2a20-43a6-883d-070abb5b5b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.160 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5713b916-a2ad-4650-83fc-a3a9fb16b0bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:41 np0005592767 systemd[1]: Started Virtual Machine qemu-15-instance-00000012.
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.184 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e0ec8783-1b4e-4dd7-8faa-2ceefd821a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.199 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5d307e8b-c42b-4a52-9688-6fd5db820f2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214980, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.213 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[361aa76d-c676-4619-8ea9-5fc965f8b7dd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391507, 'tstamp': 391507}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214985, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391510, 'tstamp': 391510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214985, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.216 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.217 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.220 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.219 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.219 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.220 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:41.220 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.935 182627 DEBUG nova.compute.manager [req-d664dea5-c2c6-4714-b957-5cc8959c1c71 req-cbb22c19-611e-46e6-8f71-73ebbb837801 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.935 182627 DEBUG oslo_concurrency.lockutils [req-d664dea5-c2c6-4714-b957-5cc8959c1c71 req-cbb22c19-611e-46e6-8f71-73ebbb837801 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.936 182627 DEBUG oslo_concurrency.lockutils [req-d664dea5-c2c6-4714-b957-5cc8959c1c71 req-cbb22c19-611e-46e6-8f71-73ebbb837801 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.936 182627 DEBUG oslo_concurrency.lockutils [req-d664dea5-c2c6-4714-b957-5cc8959c1c71 req-cbb22c19-611e-46e6-8f71-73ebbb837801 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.936 182627 DEBUG nova.compute.manager [req-d664dea5-c2c6-4714-b957-5cc8959c1c71 req-cbb22c19-611e-46e6-8f71-73ebbb837801 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:41 np0005592767 nova_compute[182623]: 2026-01-22 22:19:41.936 182627 WARNING nova.compute.manager [req-d664dea5-c2c6-4714-b957-5cc8959c1c71 req-cbb22c19-611e-46e6-8f71-73ebbb837801 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 22 17:19:42 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:19:42 np0005592767 systemd[214722]: Activating special unit Exit the Session...
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped target Main User Target.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped target Basic System.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped target Paths.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped target Sockets.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped target Timers.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:19:42 np0005592767 systemd[214722]: Closed D-Bus User Message Bus Socket.
Jan 22 17:19:42 np0005592767 systemd[214722]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:19:42 np0005592767 systemd[214722]: Removed slice User Application Slice.
Jan 22 17:19:42 np0005592767 systemd[214722]: Reached target Shutdown.
Jan 22 17:19:42 np0005592767 systemd[214722]: Finished Exit the Session.
Jan 22 17:19:42 np0005592767 systemd[214722]: Reached target Exit the Session.
Jan 22 17:19:42 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:19:42 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:19:42 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:19:42 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:19:42 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:19:42 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:19:42 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.745 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for e6db2ef0-a660-4d03-8a2d-9574e7af17d4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.747 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120382.7454548, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.747 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.750 182627 DEBUG nova.compute.manager [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.750 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.754 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance spawned successfully.#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.755 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.771 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.776 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.779 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.779 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.780 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.780 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.780 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.781 182627 DEBUG nova.virt.libvirt.driver [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.820 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.821 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120382.74634, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.821 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Started (Lifecycle Event)#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.891 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.894 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.931 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:19:42 np0005592767 nova_compute[182623]: 2026-01-22 22:19:42.940 182627 DEBUG nova.compute.manager [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:43 np0005592767 nova_compute[182623]: 2026-01-22 22:19:43.052 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:43 np0005592767 nova_compute[182623]: 2026-01-22 22:19:43.053 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:43 np0005592767 nova_compute[182623]: 2026-01-22 22:19:43.053 182627 DEBUG nova.objects.instance [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:19:43 np0005592767 nova_compute[182623]: 2026-01-22 22:19:43.167 182627 DEBUG oslo_concurrency.lockutils [None req-9ce8861e-f0ae-4a0b-a9b2-51834a555321 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.089 182627 DEBUG nova.compute.manager [req-564795b0-bf89-44a7-9eea-8ab9d5025d50 req-2f98cadf-70ac-47d1-b8af-9f672801eeb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.090 182627 DEBUG oslo_concurrency.lockutils [req-564795b0-bf89-44a7-9eea-8ab9d5025d50 req-2f98cadf-70ac-47d1-b8af-9f672801eeb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.090 182627 DEBUG oslo_concurrency.lockutils [req-564795b0-bf89-44a7-9eea-8ab9d5025d50 req-2f98cadf-70ac-47d1-b8af-9f672801eeb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.090 182627 DEBUG oslo_concurrency.lockutils [req-564795b0-bf89-44a7-9eea-8ab9d5025d50 req-2f98cadf-70ac-47d1-b8af-9f672801eeb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.090 182627 DEBUG nova.compute.manager [req-564795b0-bf89-44a7-9eea-8ab9d5025d50 req-2f98cadf-70ac-47d1-b8af-9f672801eeb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.090 182627 WARNING nova.compute.manager [req-564795b0-bf89-44a7-9eea-8ab9d5025d50 req-2f98cadf-70ac-47d1-b8af-9f672801eeb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.141 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.141 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.142 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "469eaf2b-7d53-40c9-a233-b27d702a21ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.175 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.175 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.176 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.176 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.244 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.316 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.317 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.381 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.387 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.477 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.479 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.543 182627 DEBUG oslo_concurrency.processutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.583 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.699 182627 WARNING nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.700 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5429MB free_disk=73.24525833129883GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.701 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.701 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.757 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Migration for instance 469eaf2b-7d53-40c9-a233-b27d702a21ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.781 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.839 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Instance e6db2ef0-a660-4d03-8a2d-9574e7af17d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.840 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Instance 2925f68f-5cfe-47c2-b952-de9856d8ab82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.841 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Migration 4a17ec75-f718-4d4a-b545-0a0698891226 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.841 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:19:44 np0005592767 nova_compute[182623]: 2026-01-22 22:19:44.841 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.264 182627 DEBUG nova.compute.provider_tree [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.280 182627 DEBUG nova.scheduler.client.report [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.305 182627 DEBUG nova.compute.resource_tracker [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.305 182627 DEBUG oslo_concurrency.lockutils [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.320 182627 INFO nova.compute.manager [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.441 182627 INFO nova.scheduler.client.report [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Deleted allocation for migration 4a17ec75-f718-4d4a-b545-0a0698891226#033[00m
Jan 22 17:19:45 np0005592767 nova_compute[182623]: 2026-01-22 22:19:45.442 182627 DEBUG nova.virt.libvirt.driver [None req-584a0f40-2a93-490d-8813-aecec13d07db ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.278 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.279 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.279 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.280 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.280 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.295 182627 INFO nova.compute.manager [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Terminating instance#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.310 182627 DEBUG nova.compute.manager [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:19:49 np0005592767 kernel: tap598d7930-e9 (unregistering): left promiscuous mode
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.333 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120374.330296, d3c0bb3e-4a49-475d-aa25-b992d112af28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:19:49 np0005592767 NetworkManager[54973]: <info>  [1769120389.3340] device (tap598d7930-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.334 182627 INFO nova.compute.manager [-] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.354 182627 DEBUG nova.compute.manager [None req-52867113-2789-413a-a89f-104abec8d767 - - - - - -] [instance: d3c0bb3e-4a49-475d-aa25-b992d112af28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:19:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:49Z|00116|binding|INFO|Releasing lport 598d7930-e98a-4d8d-b339-a3edf37f15dd from this chassis (sb_readonly=0)
Jan 22 17:19:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:49Z|00117|binding|INFO|Setting lport 598d7930-e98a-4d8d-b339-a3edf37f15dd down in Southbound
Jan 22 17:19:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:49Z|00118|binding|INFO|Removing iface tap598d7930-e9 ovn-installed in OVS
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.383 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.396 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:9b:74 10.100.0.7'], port_security=['fa:16:3e:a2:9b:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2925f68f-5cfe-47c2-b952-de9856d8ab82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=598d7930-e98a-4d8d-b339-a3edf37f15dd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.397 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 598d7930-e98a-4d8d-b339-a3edf37f15dd in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c unbound from our chassis#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.400 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.404 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.414 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8d9a0a-54d1-4180-82e0-0c026665093b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:49 np0005592767 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 22 17:19:49 np0005592767 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000015.scope: Consumed 15.666s CPU time.
Jan 22 17:19:49 np0005592767 systemd-machined[153912]: Machine qemu-10-instance-00000015 terminated.
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.444 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9cac1e6e-5690-4bb0-ae00-f3e26abe2b55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.446 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[29794921-9b35-4e23-a6b2-8483f7d31b1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.472 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[cea22d7e-a68f-4fd8-b441-c27f730e4787]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.485 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[30d71d45-132c-4075-b1a6-a658ece037da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19dd816f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:72:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391498, 'reachable_time': 29156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215022, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.497 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[946ab842-080b-4a0c-8bb9-4cabc2e1c61c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391507, 'tstamp': 391507}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215023, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19dd816f-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391510, 'tstamp': 391510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215023, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.499 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.500 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.505 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19dd816f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.505 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.505 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19dd816f-60, col_values=(('external_ids', {'iface-id': '32bed344-462e-4b45-8eb9-1fd48f73f73c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:49.505 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.575 182627 INFO nova.virt.libvirt.driver [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Instance destroyed successfully.#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.575 182627 DEBUG nova.objects.instance [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'resources' on Instance uuid 2925f68f-5cfe-47c2-b952-de9856d8ab82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.585 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.590 182627 DEBUG nova.virt.libvirt.vif [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:18:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1179523829',display_name='tempest-ServersAdminTestJSON-server-1179523829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1179523829',id=21,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:18:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-6vn4bppz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:18:53Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=2925f68f-5cfe-47c2-b952-de9856d8ab82,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.590 182627 DEBUG nova.network.os_vif_util [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "address": "fa:16:3e:a2:9b:74", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap598d7930-e9", "ovs_interfaceid": "598d7930-e98a-4d8d-b339-a3edf37f15dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.591 182627 DEBUG nova.network.os_vif_util [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.591 182627 DEBUG os_vif [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.593 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.593 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap598d7930-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.595 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.596 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.598 182627 INFO os_vif [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:9b:74,bridge_name='br-int',has_traffic_filtering=True,id=598d7930-e98a-4d8d-b339-a3edf37f15dd,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap598d7930-e9')#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.599 182627 INFO nova.virt.libvirt.driver [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Deleting instance files /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82_del#033[00m
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.600 182627 INFO nova.virt.libvirt.driver [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Deletion of /var/lib/nova/instances/2925f68f-5cfe-47c2-b952-de9856d8ab82_del complete
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.686 182627 INFO nova.compute.manager [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.687 182627 DEBUG oslo.service.loopingcall [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.687 182627 DEBUG nova.compute.manager [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:19:49 np0005592767 nova_compute[182623]: 2026-01-22 22:19:49.687 182627 DEBUG nova.network.neutron [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.441 182627 DEBUG nova.network.neutron [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.460 182627 INFO nova.compute.manager [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Took 0.77 seconds to deallocate network for instance.
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.565 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.566 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.573 182627 DEBUG nova.compute.manager [req-b1778c60-f74e-46ef-a5cb-89baccec5afa req-e6ff8359-609a-4767-a731-68187e49aaf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received event network-vif-deleted-598d7930-e98a-4d8d-b339-a3edf37f15dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.635 182627 DEBUG nova.compute.provider_tree [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.653 182627 DEBUG nova.scheduler.client.report [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.673 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.697 182627 INFO nova.scheduler.client.report [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Deleted allocations for instance 2925f68f-5cfe-47c2-b952-de9856d8ab82
Jan 22 17:19:50 np0005592767 nova_compute[182623]: 2026-01-22 22:19:50.788 182627 DEBUG oslo_concurrency.lockutils [None req-91b8647f-bca0-40f0-b84b-ce3840c7ebd0 f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:51 np0005592767 nova_compute[182623]: 2026-01-22 22:19:51.921 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120376.9201005, 469eaf2b-7d53-40c9-a233-b27d702a21ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:19:51 np0005592767 nova_compute[182623]: 2026-01-22 22:19:51.921 182627 INFO nova.compute.manager [-] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] VM Stopped (Lifecycle Event)
Jan 22 17:19:51 np0005592767 nova_compute[182623]: 2026-01-22 22:19:51.940 182627 DEBUG nova.compute.manager [None req-1585d7f6-fb12-4e64-b87c-41f990f54d2b - - - - - -] [instance: 469eaf2b-7d53-40c9-a233-b27d702a21ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.048 182627 DEBUG nova.compute.manager [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received event network-vif-unplugged-598d7930-e98a-4d8d-b339-a3edf37f15dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.048 182627 DEBUG oslo_concurrency.lockutils [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.048 182627 DEBUG oslo_concurrency.lockutils [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.049 182627 DEBUG oslo_concurrency.lockutils [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.049 182627 DEBUG nova.compute.manager [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] No waiting events found dispatching network-vif-unplugged-598d7930-e98a-4d8d-b339-a3edf37f15dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.049 182627 WARNING nova.compute.manager [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received unexpected event network-vif-unplugged-598d7930-e98a-4d8d-b339-a3edf37f15dd for instance with vm_state deleted and task_state None.
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.050 182627 DEBUG nova.compute.manager [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.050 182627 DEBUG oslo_concurrency.lockutils [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.050 182627 DEBUG oslo_concurrency.lockutils [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.050 182627 DEBUG oslo_concurrency.lockutils [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2925f68f-5cfe-47c2-b952-de9856d8ab82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.051 182627 DEBUG nova.compute.manager [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] No waiting events found dispatching network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:19:52 np0005592767 nova_compute[182623]: 2026-01-22 22:19:52.051 182627 WARNING nova.compute.manager [req-be8a47e3-2db5-4fc3-a8a0-d75038efc29d req-3f6fbc84-14df-4e2d-a7c1-3de6fe9154b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Received unexpected event network-vif-plugged-598d7930-e98a-4d8d-b339-a3edf37f15dd for instance with vm_state deleted and task_state None.
Jan 22 17:19:52 np0005592767 podman[215042]: 2026-01-22 22:19:52.190351695 +0000 UTC m=+0.085733551 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.122 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.123 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.123 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.123 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.123 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.134 182627 INFO nova.compute.manager [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Terminating instance
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.143 182627 DEBUG nova.compute.manager [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:19:53 np0005592767 kernel: tap648c69ef-5b (unregistering): left promiscuous mode
Jan 22 17:19:53 np0005592767 NetworkManager[54973]: <info>  [1769120393.1615] device (tap648c69ef-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.165 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:53Z|00119|binding|INFO|Releasing lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 from this chassis (sb_readonly=0)
Jan 22 17:19:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:53Z|00120|binding|INFO|Setting lport 648c69ef-5bab-43c9-99a7-4b49b3122d56 down in Southbound
Jan 22 17:19:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:19:53Z|00121|binding|INFO|Removing iface tap648c69ef-5b ovn-installed in OVS
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.173 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:a6:7a 10.100.0.8'], port_security=['fa:16:3e:ea:a6:7a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e6db2ef0-a660-4d03-8a2d-9574e7af17d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '214876cdc63543458d35ee214fe21b2c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '78ee0b7c-9320-4ff9-9442-9377451949b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=824e1618-f9e7-48da-98bd-2fdc50a3dd94, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=648c69ef-5bab-43c9-99a7-4b49b3122d56) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.174 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 648c69ef-5bab-43c9-99a7-4b49b3122d56 in datapath 19dd816f-669a-4bda-b508-a3ddcd4c2d7c unbound from our chassis
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.176 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19dd816f-669a-4bda-b508-a3ddcd4c2d7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.177 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5377b6f4-26c8-4a67-9306-2301cbde6327]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.178 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c namespace which is not needed anymore
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.198 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 22 17:19:53 np0005592767 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Consumed 11.757s CPU time.
Jan 22 17:19:53 np0005592767 systemd-machined[153912]: Machine qemu-15-instance-00000012 terminated.
Jan 22 17:19:53 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [NOTICE]   (213991) : haproxy version is 2.8.14-c23fe91
Jan 22 17:19:53 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [NOTICE]   (213991) : path to executable is /usr/sbin/haproxy
Jan 22 17:19:53 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [WARNING]  (213991) : Exiting Master process...
Jan 22 17:19:53 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [WARNING]  (213991) : Exiting Master process...
Jan 22 17:19:53 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [ALERT]    (213991) : Current worker (213993) exited with code 143 (Terminated)
Jan 22 17:19:53 np0005592767 neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c[213987]: [WARNING]  (213991) : All workers exited. Exiting... (0)
Jan 22 17:19:53 np0005592767 systemd[1]: libpod-33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a.scope: Deactivated successfully.
Jan 22 17:19:53 np0005592767 podman[215085]: 2026-01-22 22:19:53.315326456 +0000 UTC m=+0.045363083 container died 33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:19:53 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a-userdata-shm.mount: Deactivated successfully.
Jan 22 17:19:53 np0005592767 systemd[1]: var-lib-containers-storage-overlay-086ab037f8e24150776b7079183c8b007a36dce03c95b0d49b2e7350dd1b76bb-merged.mount: Deactivated successfully.
Jan 22 17:19:53 np0005592767 podman[215085]: 2026-01-22 22:19:53.34659046 +0000 UTC m=+0.076627087 container cleanup 33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:19:53 np0005592767 systemd[1]: libpod-conmon-33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a.scope: Deactivated successfully.
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.406 182627 INFO nova.virt.libvirt.driver [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Instance destroyed successfully.
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.407 182627 DEBUG nova.objects.instance [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lazy-loading 'resources' on Instance uuid e6db2ef0-a660-4d03-8a2d-9574e7af17d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:19:53 np0005592767 podman[215117]: 2026-01-22 22:19:53.415141382 +0000 UTC m=+0.042088078 container remove 33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.420 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c566d28-b2f6-4e08-86ee-1424493c411a]: (4, ('Thu Jan 22 10:19:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c (33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a)\n33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a\nThu Jan 22 10:19:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c (33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a)\n33c512be95bc8509a07cd4b42973f9ad8bb61837d56e5986377dae348bc5883a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.422 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[293640cd-379a-4a7d-8fe1-34add9ec02d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.422 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19dd816f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.425 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 kernel: tap19dd816f-60: left promiscuous mode
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.430 182627 DEBUG nova.virt.libvirt.vif [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:18:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1481617455',display_name='tempest-ServersAdminTestJSON-server-1481617455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1481617455',id=18,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='214876cdc63543458d35ee214fe21b2c',ramdisk_id='',reservation_id='r-nenin4tn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1825362070',owner_user_name='tempest-ServersAdminTestJSON-1825362070-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:19:45Z,user_data=None,user_id='f23ea0c335b84bd2b78725d5a5491d0a',uuid=e6db2ef0-a660-4d03-8a2d-9574e7af17d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.430 182627 DEBUG nova.network.os_vif_util [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converting VIF {"id": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "address": "fa:16:3e:ea:a6:7a", "network": {"id": "19dd816f-669a-4bda-b508-a3ddcd4c2d7c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543215627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "214876cdc63543458d35ee214fe21b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap648c69ef-5b", "ovs_interfaceid": "648c69ef-5bab-43c9-99a7-4b49b3122d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.431 182627 DEBUG nova.network.os_vif_util [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.432 182627 DEBUG os_vif [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.433 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.434 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap648c69ef-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.435 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.440 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[38f2146b-da95-433a-a64a-0514ecf80c92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.441 182627 INFO os_vif [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:a6:7a,bridge_name='br-int',has_traffic_filtering=True,id=648c69ef-5bab-43c9-99a7-4b49b3122d56,network=Network(19dd816f-669a-4bda-b508-a3ddcd4c2d7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap648c69ef-5b')#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.441 182627 INFO nova.virt.libvirt.driver [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deleting instance files /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4_del#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.442 182627 INFO nova.virt.libvirt.driver [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deletion of /var/lib/nova/instances/e6db2ef0-a660-4d03-8a2d-9574e7af17d4_del complete#033[00m
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.459 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a02e55d-fa3e-430f-8a37-a9c38757344b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.460 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[285484d6-afdf-49e6-b2e3-1f381cc2ffa6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.477 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a0caccfe-24ac-4424-8a52-9781cebb5eb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391492, 'reachable_time': 19463, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215149, 'error': None, 'target': 'ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:53 np0005592767 systemd[1]: run-netns-ovnmeta\x2d19dd816f\x2d669a\x2d4bda\x2db508\x2da3ddcd4c2d7c.mount: Deactivated successfully.
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.479 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19dd816f-669a-4bda-b508-a3ddcd4c2d7c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:19:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:19:53.479 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc723be-b708-47e2-a720-5f1e5f9fc9f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.528 182627 INFO nova.compute.manager [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.529 182627 DEBUG oslo.service.loopingcall [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.530 182627 DEBUG nova.compute.manager [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:19:53 np0005592767 nova_compute[182623]: 2026-01-22 22:19:53.530 182627 DEBUG nova.network.neutron [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.161 182627 DEBUG nova.network.neutron [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.179 182627 INFO nova.compute.manager [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Took 0.65 seconds to deallocate network for instance.#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.235 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.236 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.270 182627 DEBUG nova.compute.manager [req-4f0e2cdd-6e7a-482f-9db1-8ce4e94e5f83 req-7c82f682-2373-479c-9950-aa907f41c17a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-deleted-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.300 182627 DEBUG nova.compute.provider_tree [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.319 182627 DEBUG nova.scheduler.client.report [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.342 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.370 182627 INFO nova.scheduler.client.report [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Deleted allocations for instance e6db2ef0-a660-4d03-8a2d-9574e7af17d4#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.442 182627 DEBUG oslo_concurrency.lockutils [None req-c9cb3643-2d79-47d6-8099-89b39edeb01d f23ea0c335b84bd2b78725d5a5491d0a 214876cdc63543458d35ee214fe21b2c - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.586 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.741 182627 DEBUG nova.compute.manager [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.742 182627 DEBUG oslo_concurrency.lockutils [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.742 182627 DEBUG oslo_concurrency.lockutils [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.743 182627 DEBUG oslo_concurrency.lockutils [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.743 182627 DEBUG nova.compute.manager [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.743 182627 WARNING nova.compute.manager [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-unplugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.744 182627 DEBUG nova.compute.manager [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.744 182627 DEBUG oslo_concurrency.lockutils [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.745 182627 DEBUG oslo_concurrency.lockutils [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.746 182627 DEBUG oslo_concurrency.lockutils [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e6db2ef0-a660-4d03-8a2d-9574e7af17d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.746 182627 DEBUG nova.compute.manager [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] No waiting events found dispatching network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:19:54 np0005592767 nova_compute[182623]: 2026-01-22 22:19:54.747 182627 WARNING nova.compute.manager [req-92f28484-fcb9-417d-abf4-e8ab4f49ce07 req-cd6d17ef-b535-4c71-b02c-c6725a21034f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Received unexpected event network-vif-plugged-648c69ef-5bab-43c9-99a7-4b49b3122d56 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:19:56 np0005592767 podman[215151]: 2026-01-22 22:19:56.198038316 +0000 UTC m=+0.114910715 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:19:56 np0005592767 podman[215150]: 2026-01-22 22:19:56.21413451 +0000 UTC m=+0.134645414 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 17:19:56 np0005592767 nova_compute[182623]: 2026-01-22 22:19:56.229 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:19:58 np0005592767 nova_compute[182623]: 2026-01-22 22:19:58.454 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:58 np0005592767 nova_compute[182623]: 2026-01-22 22:19:58.649 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:19:59 np0005592767 nova_compute[182623]: 2026-01-22 22:19:59.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:02 np0005592767 podman[215196]: 2026-01-22 22:20:02.136061574 +0000 UTC m=+0.049282946 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:20:02 np0005592767 nova_compute[182623]: 2026-01-22 22:20:02.263 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Creating tmpfile /var/lib/nova/instances/tmp73gi6c2o to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 22 17:20:02 np0005592767 nova_compute[182623]: 2026-01-22 22:20:02.264 182627 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 22 17:20:03 np0005592767 podman[215217]: 2026-01-22 22:20:03.130216103 +0000 UTC m=+0.051226293 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:20:03 np0005592767 nova_compute[182623]: 2026-01-22 22:20:03.456 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:03 np0005592767 nova_compute[182623]: 2026-01-22 22:20:03.895 182627 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b484b5f7-0814-4161-b492-633788f2961f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 22 17:20:03 np0005592767 nova_compute[182623]: 2026-01-22 22:20:03.922 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:20:03 np0005592767 nova_compute[182623]: 2026-01-22 22:20:03.922 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquired lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:20:03 np0005592767 nova_compute[182623]: 2026-01-22 22:20:03.923 182627 DEBUG nova.network.neutron [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:20:04 np0005592767 nova_compute[182623]: 2026-01-22 22:20:04.574 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120389.5724142, 2925f68f-5cfe-47c2-b952-de9856d8ab82 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:04 np0005592767 nova_compute[182623]: 2026-01-22 22:20:04.575 182627 INFO nova.compute.manager [-] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:20:04 np0005592767 nova_compute[182623]: 2026-01-22 22:20:04.595 182627 DEBUG nova.compute.manager [None req-ba393be4-ddb1-4dd7-a7c0-37bdd21e3402 - - - - - -] [instance: 2925f68f-5cfe-47c2-b952-de9856d8ab82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:04 np0005592767 nova_compute[182623]: 2026-01-22 22:20:04.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:04 np0005592767 nova_compute[182623]: 2026-01-22 22:20:04.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.788 182627 DEBUG nova.network.neutron [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating instance_info_cache with network_info: [{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.829 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Releasing lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.841 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b484b5f7-0814-4161-b492-633788f2961f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.842 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Creating instance directory: /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.843 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Creating disk.info with the contents: {'/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk': 'qcow2', '/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.843 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.844 182627 DEBUG nova.objects.instance [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'trusted_certs' on Instance uuid b484b5f7-0814-4161-b492-633788f2961f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.868 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.932 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.933 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.933 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:05 np0005592767 nova_compute[182623]: 2026-01-22 22:20:05.943 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.003 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.004 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.055 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.056 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.057 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.115 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.116 182627 DEBUG nova.virt.disk.api [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Checking if we can resize image /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.117 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.172 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.173 182627 DEBUG nova.virt.disk.api [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Cannot resize image /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.174 182627 DEBUG nova.objects.instance [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lazy-loading 'migration_context' on Instance uuid b484b5f7-0814-4161-b492-633788f2961f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.195 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.222 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.225 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config to /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.226 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.770 182627 DEBUG oslo_concurrency.processutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk.config /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.771 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.773 182627 DEBUG nova.virt.libvirt.vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1962651622',display_name='tempest-LiveMigrationTest-server-1962651622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1962651622',id=30,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-rvizwmjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:19:59Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=b484b5f7-0814-4161-b492-633788f2961f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.773 182627 DEBUG nova.network.os_vif_util [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converting VIF {"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.774 182627 DEBUG nova.network.os_vif_util [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.775 182627 DEBUG os_vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.775 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.776 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.776 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.781 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e88b712-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.781 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e88b712-be, col_values=(('external_ids', {'iface-id': '7e88b712-bef4-4434-b405-04af2a2d3d0f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:07:e9', 'vm-uuid': 'b484b5f7-0814-4161-b492-633788f2961f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:06 np0005592767 NetworkManager[54973]: <info>  [1769120406.7855] manager: (tap7e88b712-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.786 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.789 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.791 182627 INFO os_vif [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be')#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.792 182627 DEBUG nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.792 182627 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b484b5f7-0814-4161-b492-633788f2961f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:06 np0005592767 nova_compute[182623]: 2026-01-22 22:20:06.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:20:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:20:07 np0005592767 nova_compute[182623]: 2026-01-22 22:20:07.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:07 np0005592767 nova_compute[182623]: 2026-01-22 22:20:07.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:20:07 np0005592767 nova_compute[182623]: 2026-01-22 22:20:07.966 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:20:07 np0005592767 nova_compute[182623]: 2026-01-22 22:20:07.967 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.405 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120393.4039996, e6db2ef0-a660-4d03-8a2d-9574e7af17d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.406 182627 INFO nova.compute.manager [-] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.433 182627 DEBUG nova.compute.manager [None req-8e269de4-5f72-4873-9c9b-58dbbb5ee2fb - - - - - -] [instance: e6db2ef0-a660-4d03-8a2d-9574e7af17d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.653 182627 DEBUG nova.network.neutron [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Port 7e88b712-bef4-4434-b405-04af2a2d3d0f updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.668 182627 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp73gi6c2o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b484b5f7-0814-4161-b492-633788f2961f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:20:08 np0005592767 kernel: tap7e88b712-be: entered promiscuous mode
Jan 22 17:20:08 np0005592767 NetworkManager[54973]: <info>  [1769120408.9416] manager: (tap7e88b712-be): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.942 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:08Z|00122|binding|INFO|Claiming lport 7e88b712-bef4-4434-b405-04af2a2d3d0f for this additional chassis.
Jan 22 17:20:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:08Z|00123|binding|INFO|7e88b712-bef4-4434-b405-04af2a2d3d0f: Claiming fa:16:3e:ad:07:e9 10.100.0.14
Jan 22 17:20:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:08Z|00124|binding|INFO|Claiming lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 for this additional chassis.
Jan 22 17:20:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:08Z|00125|binding|INFO|a87b1642-1ac3-4b35-809d-79c74a2f4e13: Claiming fa:16:3e:ac:a8:b5 19.80.0.59
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.945 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:08 np0005592767 nova_compute[182623]: 2026-01-22 22:20:08.950 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:08 np0005592767 systemd-udevd[215279]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:20:08 np0005592767 systemd-machined[153912]: New machine qemu-16-instance-0000001e.
Jan 22 17:20:09 np0005592767 NetworkManager[54973]: <info>  [1769120409.0039] device (tap7e88b712-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:20:09 np0005592767 NetworkManager[54973]: <info>  [1769120409.0045] device (tap7e88b712-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:20:09 np0005592767 nova_compute[182623]: 2026-01-22 22:20:09.028 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:09Z|00126|binding|INFO|Setting lport 7e88b712-bef4-4434-b405-04af2a2d3d0f ovn-installed in OVS
Jan 22 17:20:09 np0005592767 nova_compute[182623]: 2026-01-22 22:20:09.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:09 np0005592767 systemd[1]: Started Virtual Machine qemu-16-instance-0000001e.
Jan 22 17:20:09 np0005592767 nova_compute[182623]: 2026-01-22 22:20:09.785 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:09 np0005592767 nova_compute[182623]: 2026-01-22 22:20:09.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:11 np0005592767 podman[215296]: 2026-01-22 22:20:11.179779263 +0000 UTC m=+0.080915801 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:20:11 np0005592767 nova_compute[182623]: 2026-01-22 22:20:11.635 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120411.6351635, b484b5f7-0814-4161-b492-633788f2961f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:11 np0005592767 nova_compute[182623]: 2026-01-22 22:20:11.636 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Started (Lifecycle Event)#033[00m
Jan 22 17:20:11 np0005592767 nova_compute[182623]: 2026-01-22 22:20:11.662 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:11 np0005592767 nova_compute[182623]: 2026-01-22 22:20:11.784 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:12.092 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:12.093 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:12.093 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.621 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120412.6214108, b484b5f7-0814-4161-b492-633788f2961f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.622 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.641 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.645 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.662 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.915 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.916 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:20:12 np0005592767 nova_compute[182623]: 2026-01-22 22:20:12.998 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.052 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.053 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.111 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:13Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:07:e9 10.100.0.14
Jan 22 17:20:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:13Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:07:e9 10.100.0.14
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.288 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.289 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5607MB free_disk=73.24757766723633GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.290 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.290 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.339 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration for instance b484b5f7-0814-4161-b492-633788f2961f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.358 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating resource usage from migration 71b7dbf7-048d-47da-b5e0-c5906bcb8587#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.359 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Starting to track incoming migration 71b7dbf7-048d-47da-b5e0-c5906bcb8587 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.420 182627 WARNING nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance b484b5f7-0814-4161-b492-633788f2961f has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.421 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.421 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.470 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.484 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.507 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:20:13 np0005592767 nova_compute[182623]: 2026-01-22 22:20:13.507 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:14Z|00127|binding|INFO|Claiming lport 7e88b712-bef4-4434-b405-04af2a2d3d0f for this chassis.
Jan 22 17:20:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:14Z|00128|binding|INFO|7e88b712-bef4-4434-b405-04af2a2d3d0f: Claiming fa:16:3e:ad:07:e9 10.100.0.14
Jan 22 17:20:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:14Z|00129|binding|INFO|Claiming lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 for this chassis.
Jan 22 17:20:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:14Z|00130|binding|INFO|a87b1642-1ac3-4b35-809d-79c74a2f4e13: Claiming fa:16:3e:ac:a8:b5 19.80.0.59
Jan 22 17:20:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:14Z|00131|binding|INFO|Setting lport 7e88b712-bef4-4434-b405-04af2a2d3d0f up in Southbound
Jan 22 17:20:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:14Z|00132|binding|INFO|Setting lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 up in Southbound
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.824 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:07:e9 10.100.0.14'], port_security=['fa:16:3e:ad:07:e9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-850980191', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-850980191', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=7e88b712-bef4-4434-b405-04af2a2d3d0f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.826 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:a8:b5 19.80.0.59'], port_security=['fa:16:3e:ac:a8:b5 19.80.0.59'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['7e88b712-bef4-4434-b405-04af2a2d3d0f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1993138940', 'neutron:cidrs': '19.80.0.59/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1993138940', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=529a2a70-69a8-4f19-a951-f1c58852ecd0, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a87b1642-1ac3-4b35-809d-79c74a2f4e13) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.827 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 7e88b712-bef4-4434-b405-04af2a2d3d0f in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea bound to our chassis#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.828 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea#033[00m
Jan 22 17:20:14 np0005592767 nova_compute[182623]: 2026-01-22 22:20:14.839 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.839 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7168e3-f0c7-48d3-9982-3d058d24d2a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.841 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap698e77c5-f1 in ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.844 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap698e77c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.844 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e4edc8-33be-44b9-aef0-205f79d1fb91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.844 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[34bbc846-516d-4e26-a10a-580f1f8c1bf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.858 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[5b776cb7-3ab6-4a58-ad6d-94d4244eb996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.880 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bb91431c-f002-41e2-9038-213beda39d23]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.915 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4c1c2096-cc49-4a8c-9a59-15f9254825fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.921 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c3387bc9-34eb-4a6b-af19-28dac91cccfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 NetworkManager[54973]: <info>  [1769120414.9232] manager: (tap698e77c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Jan 22 17:20:14 np0005592767 systemd-udevd[215344]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.957 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[27b880c7-5c63-4989-b924-ec2e0aada91a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.960 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b39b0926-f224-4218-bc54-a6b3eaa35d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:14 np0005592767 NetworkManager[54973]: <info>  [1769120414.9880] device (tap698e77c5-f0): carrier: link connected
Jan 22 17:20:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:14.993 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8e959bb9-25bb-4335-88ac-d9bf1d8cc30c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.013 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8023fb3b-dbb7-47ab-b622-292209e26a15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401156, 'reachable_time': 28794, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215363, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.029 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0c06374f-ca58-491e-a94d-f93e0db71d33]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:3733'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401156, 'tstamp': 401156}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215364, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.047 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbb6aa0-d5a7-4b7c-9090-e43f82f3979c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap698e77c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:37:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401156, 'reachable_time': 28794, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215365, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.076 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4a29ddf6-8a76-40f9-a8ad-e3b9075f9414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.138 182627 INFO nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Post operation of migration started#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.138 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9608ae-10a7-4832-a89e-4dfe53888741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.139 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.140 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.140 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap698e77c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.143 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 NetworkManager[54973]: <info>  [1769120415.1437] manager: (tap698e77c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 22 17:20:15 np0005592767 kernel: tap698e77c5-f0: entered promiscuous mode
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.146 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap698e77c5-f0, col_values=(('external_ids', {'iface-id': 'a18a2be2-f1a5-44ce-96ac-2c546dab3eef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.147 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:15Z|00133|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.163 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.164 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c26a30e5-cd6e-4b6b-b22d-af02f96d66e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.165 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/698e77c5-fce6-47a5-b6e3-f4c56da226ea.pid.haproxy
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 698e77c5-fce6-47a5-b6e3-f4c56da226ea
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.166 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'env', 'PROCESS_TAG=haproxy-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/698e77c5-fce6-47a5-b6e3-f4c56da226ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:20:15 np0005592767 podman[215398]: 2026-01-22 22:20:15.533419967 +0000 UTC m=+0.047671890 container create ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:20:15 np0005592767 systemd[1]: Started libpod-conmon-ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644.scope.
Jan 22 17:20:15 np0005592767 podman[215398]: 2026-01-22 22:20:15.506828028 +0000 UTC m=+0.021079931 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:20:15 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:20:15 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb6bcc7caa66b6759c18b33e929aa2ee195344d82726a02fc6fd42087e216feb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:20:15 np0005592767 podman[215398]: 2026-01-22 22:20:15.628997811 +0000 UTC m=+0.143249784 container init ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:20:15 np0005592767 podman[215398]: 2026-01-22 22:20:15.634229132 +0000 UTC m=+0.148481055 container start ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:20:15 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [NOTICE]   (215418) : New worker (215420) forked
Jan 22 17:20:15 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [NOTICE]   (215418) : Loading success.
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.685 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.686 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquired lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.686 182627 DEBUG nova.network.neutron [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.689 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a87b1642-1ac3-4b35-809d-79c74a2f4e13 in datapath 75073b6a-f711-4d82-9e11-07cd8a1d16e2 bound to our chassis#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.691 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 75073b6a-f711-4d82-9e11-07cd8a1d16e2#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.703 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[07b48e9c-5653-4db8-a546-e47355754f8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.704 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap75073b6a-f1 in ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.706 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap75073b6a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.706 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b6be3eba-8f25-4a08-9721-f93b6072502a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.707 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[48c7222b-d71c-4b28-8c2b-2b8a9fb43a85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.720 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c209b92e-7c53-4e99-853f-6156ef9b7620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.734 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1975806c-07a6-49de-bc95-2d27b25a9a9c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.755 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd43589-23dc-4ebe-a3fa-71607e5fa7d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.760 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8047f425-e712-4733-9495-3971d4c50f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 NetworkManager[54973]: <info>  [1769120415.7620] manager: (tap75073b6a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/67)
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.795 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[07628470-784e-4e73-98d5-ca2db6b51acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.799 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0a0a1a-b3d3-4848-a6c4-78a884168080]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 NetworkManager[54973]: <info>  [1769120415.8298] device (tap75073b6a-f0): carrier: link connected
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.834 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[61c83076-7d7f-427a-80b8-9ac858cb45db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.852 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe36394-5dd2-4539-a619-782c6bce100b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75073b6a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:2f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401240, 'reachable_time': 40822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215439, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.866 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[35b4f4d8-590f-4a1e-80c1-fa1032a151d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:2fb5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401240, 'tstamp': 401240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215440, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.882 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[518c9c29-af98-4f24-a30c-6d28532804c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75073b6a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:2f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401240, 'reachable_time': 40822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215441, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.905 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[803ba172-7cec-46a7-9323-d6cd0e55d8a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.962 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4b26dd-a50b-447d-b1c0-a30e21c491ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.964 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75073b6a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.965 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.966 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75073b6a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 NetworkManager[54973]: <info>  [1769120415.9700] manager: (tap75073b6a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 22 17:20:15 np0005592767 kernel: tap75073b6a-f0: entered promiscuous mode
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.970 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.972 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap75073b6a-f0, col_values=(('external_ids', {'iface-id': '65e3ee7d-2176-49c3-aeee-08b035ff0bbf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.973 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:15Z|00134|binding|INFO|Releasing lport 65e3ee7d-2176-49c3-aeee-08b035ff0bbf from this chassis (sb_readonly=0)
Jan 22 17:20:15 np0005592767 nova_compute[182623]: 2026-01-22 22:20:15.991 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.992 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/75073b6a-f711-4d82-9e11-07cd8a1d16e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/75073b6a-f711-4d82-9e11-07cd8a1d16e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.994 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c9172707-bf28-4a78-9901-56c8ec76ca1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.995 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-75073b6a-f711-4d82-9e11-07cd8a1d16e2
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/75073b6a-f711-4d82-9e11-07cd8a1d16e2.pid.haproxy
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 75073b6a-f711-4d82-9e11-07cd8a1d16e2
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:20:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:15.996 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'env', 'PROCESS_TAG=haproxy-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/75073b6a-f711-4d82-9e11-07cd8a1d16e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:20:16 np0005592767 podman[215473]: 2026-01-22 22:20:16.363783078 +0000 UTC m=+0.042205271 container create 9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:20:16 np0005592767 systemd[1]: Started libpod-conmon-9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9.scope.
Jan 22 17:20:16 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:20:16 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8a1445b690dd10760714c466bbc6c46cc311e4ced1cddf4037fcc8d9cf4124f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:20:16 np0005592767 podman[215473]: 2026-01-22 22:20:16.42400045 +0000 UTC m=+0.102422663 container init 9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:20:16 np0005592767 podman[215473]: 2026-01-22 22:20:16.428766118 +0000 UTC m=+0.107188311 container start 9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:20:16 np0005592767 podman[215473]: 2026-01-22 22:20:16.341986118 +0000 UTC m=+0.020408341 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:20:16 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [NOTICE]   (215493) : New worker (215495) forked
Jan 22 17:20:16 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [NOTICE]   (215493) : Loading success.
Jan 22 17:20:16 np0005592767 nova_compute[182623]: 2026-01-22 22:20:16.786 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:18 np0005592767 nova_compute[182623]: 2026-01-22 22:20:18.128 182627 DEBUG nova.network.neutron [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating instance_info_cache with network_info: [{"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:20:18 np0005592767 nova_compute[182623]: 2026-01-22 22:20:18.142 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Releasing lock "refresh_cache-b484b5f7-0814-4161-b492-633788f2961f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:20:18 np0005592767 nova_compute[182623]: 2026-01-22 22:20:18.165 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:18 np0005592767 nova_compute[182623]: 2026-01-22 22:20:18.165 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:18 np0005592767 nova_compute[182623]: 2026-01-22 22:20:18.166 182627 DEBUG oslo_concurrency.lockutils [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:18 np0005592767 nova_compute[182623]: 2026-01-22 22:20:18.169 182627 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 22 17:20:18 np0005592767 virtqemud[182095]: Domain id=16 name='instance-0000001e' uuid=b484b5f7-0814-4161-b492-633788f2961f is tainted: custom-monitor
Jan 22 17:20:19 np0005592767 nova_compute[182623]: 2026-01-22 22:20:19.175 182627 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 22 17:20:19 np0005592767 nova_compute[182623]: 2026-01-22 22:20:19.842 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:20 np0005592767 nova_compute[182623]: 2026-01-22 22:20:20.183 182627 INFO nova.virt.libvirt.driver [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 22 17:20:20 np0005592767 nova_compute[182623]: 2026-01-22 22:20:20.187 182627 DEBUG nova.compute.manager [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:20 np0005592767 nova_compute[182623]: 2026-01-22 22:20:20.203 182627 DEBUG nova.objects.instance [None req-104dac30-f1e2-437c-87a5-1feca3423b04 ef94e9ba7dec44189855a46feed21ab7 fe513118fd494504a1db1ead2b99d5ad - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:20:21 np0005592767 nova_compute[182623]: 2026-01-22 22:20:21.811 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.702 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.703 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.703 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.704 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.704 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.715 182627 INFO nova.compute.manager [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Terminating instance
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.725 182627 DEBUG nova.compute.manager [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:20:22 np0005592767 kernel: tap7e88b712-be (unregistering): left promiscuous mode
Jan 22 17:20:22 np0005592767 NetworkManager[54973]: <info>  [1769120422.7495] device (tap7e88b712-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.753 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00135|binding|INFO|Releasing lport 7e88b712-bef4-4434-b405-04af2a2d3d0f from this chassis (sb_readonly=0)
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00136|binding|INFO|Setting lport 7e88b712-bef4-4434-b405-04af2a2d3d0f down in Southbound
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00137|binding|INFO|Releasing lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 from this chassis (sb_readonly=0)
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00138|binding|INFO|Setting lport a87b1642-1ac3-4b35-809d-79c74a2f4e13 down in Southbound
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00139|binding|INFO|Removing iface tap7e88b712-be ovn-installed in OVS
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00140|binding|INFO|Releasing lport a18a2be2-f1a5-44ce-96ac-2c546dab3eef from this chassis (sb_readonly=0)
Jan 22 17:20:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:20:22Z|00141|binding|INFO|Releasing lport 65e3ee7d-2176-49c3-aeee-08b035ff0bbf from this chassis (sb_readonly=0)
Jan 22 17:20:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:22.764 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:07:e9 10.100.0.14'], port_security=['fa:16:3e:ad:07:e9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-850980191', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b484b5f7-0814-4161-b492-633788f2961f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-850980191', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57455e87-044b-404f-a524-0338a8363f01, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=7e88b712-bef4-4434-b405-04af2a2d3d0f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:20:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:22.765 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:a8:b5 19.80.0.59'], port_security=['fa:16:3e:ac:a8:b5 19.80.0.59'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['7e88b712-bef4-4434-b405-04af2a2d3d0f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1993138940', 'neutron:cidrs': '19.80.0.59/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1993138940', 'neutron:project_id': '9ead4241c55147dcbe136a6d6a69a60f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'df4b2ed1-2332-4fa7-acba-d6ab92d3ab25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=529a2a70-69a8-4f19-a951-f1c58852ecd0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a87b1642-1ac3-4b35-809d-79c74a2f4e13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:20:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:22.766 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 7e88b712-bef4-4434-b405-04af2a2d3d0f in datapath 698e77c5-fce6-47a5-b6e3-f4c56da226ea unbound from our chassis
Jan 22 17:20:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:22.767 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 698e77c5-fce6-47a5-b6e3-f4c56da226ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:20:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:22.771 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3c70f8aa-8d4d-43a4-8ce8-58e18fa69d8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:22.771 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea namespace which is not needed anymore
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.792 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 22 17:20:22 np0005592767 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001e.scope: Consumed 4.268s CPU time.
Jan 22 17:20:22 np0005592767 systemd-machined[153912]: Machine qemu-16-instance-0000001e terminated.
Jan 22 17:20:22 np0005592767 podman[215506]: 2026-01-22 22:20:22.845911123 +0000 UTC m=+0.074515116 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.884 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [NOTICE]   (215418) : haproxy version is 2.8.14-c23fe91
Jan 22 17:20:22 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [NOTICE]   (215418) : path to executable is /usr/sbin/haproxy
Jan 22 17:20:22 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [WARNING]  (215418) : Exiting Master process...
Jan 22 17:20:22 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [ALERT]    (215418) : Current worker (215420) exited with code 143 (Terminated)
Jan 22 17:20:22 np0005592767 neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea[215414]: [WARNING]  (215418) : All workers exited. Exiting... (0)
Jan 22 17:20:22 np0005592767 systemd[1]: libpod-ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644.scope: Deactivated successfully.
Jan 22 17:20:22 np0005592767 podman[215545]: 2026-01-22 22:20:22.901864671 +0000 UTC m=+0.047335610 container died ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:20:22 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644-userdata-shm.mount: Deactivated successfully.
Jan 22 17:20:22 np0005592767 systemd[1]: var-lib-containers-storage-overlay-bb6bcc7caa66b6759c18b33e929aa2ee195344d82726a02fc6fd42087e216feb-merged.mount: Deactivated successfully.
Jan 22 17:20:22 np0005592767 podman[215545]: 2026-01-22 22:20:22.935626157 +0000 UTC m=+0.081097086 container cleanup ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:20:22 np0005592767 systemd[1]: libpod-conmon-ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644.scope: Deactivated successfully.
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.951 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.955 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.988 182627 INFO nova.virt.libvirt.driver [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] Instance destroyed successfully.
Jan 22 17:20:22 np0005592767 nova_compute[182623]: 2026-01-22 22:20:22.990 182627 DEBUG nova.objects.instance [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lazy-loading 'resources' on Instance uuid b484b5f7-0814-4161-b492-633788f2961f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.002 182627 DEBUG nova.virt.libvirt.vif [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1962651622',display_name='tempest-LiveMigrationTest-server-1962651622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1962651622',id=30,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:19:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9ead4241c55147dcbe136a6d6a69a60f',ramdisk_id='',reservation_id='r-rvizwmjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-652633664',owner_user_name='tempest-LiveMigrationTest-652633664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:20:20Z,user_data=None,user_id='06b4b3807dc64d83b8bfbbf0c4d31d77',uuid=b484b5f7-0814-4161-b492-633788f2961f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.003 182627 DEBUG nova.network.os_vif_util [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converting VIF {"id": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "address": "fa:16:3e:ad:07:e9", "network": {"id": "698e77c5-fce6-47a5-b6e3-f4c56da226ea", "bridge": "br-int", "label": "tempest-LiveMigrationTest-225519230-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ead4241c55147dcbe136a6d6a69a60f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e88b712-be", "ovs_interfaceid": "7e88b712-bef4-4434-b405-04af2a2d3d0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.004 182627 DEBUG nova.network.os_vif_util [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.005 182627 DEBUG os_vif [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:20:23 np0005592767 podman[215576]: 2026-01-22 22:20:23.008157014 +0000 UTC m=+0.044361033 container remove ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.008 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.008 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e88b712-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.015 182627 INFO os_vif [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:07:e9,bridge_name='br-int',has_traffic_filtering=True,id=7e88b712-bef4-4434-b405-04af2a2d3d0f,network=Network(698e77c5-fce6-47a5-b6e3-f4c56da226ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7e88b712-be')
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.015 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18223313-33fc-4f96-8832-ffed5087ed17]: (4, ('Thu Jan 22 10:20:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644)\nba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644\nThu Jan 22 10:20:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea (ba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644)\nba71ff501824f0a346d432232d08cdeee92a24cc0104d7a9e006c286631b2644\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.016 182627 INFO nova.virt.libvirt.driver [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Deleting instance files /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f_del
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.016 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9cde453e-44b3-4666-8f3f-83618a7c3b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.017 182627 INFO nova.virt.libvirt.driver [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Deletion of /var/lib/nova/instances/b484b5f7-0814-4161-b492-633788f2961f_del complete
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.017 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap698e77c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:20:23 np0005592767 kernel: tap698e77c5-f0: left promiscuous mode
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.021 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.032 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4958e4a8-5d69-4d1f-8241-66cc28f745cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.054 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[17fdf4b3-35ff-40ed-af12-90f3e65c5e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.056 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61860609-0d9d-46b5-b24d-059fd5f343f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.073 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3f2567-43a7-49a5-a3ea-2694acd89d19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401148, 'reachable_time': 33240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215603, 'error': None, 'target': 'ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.076 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-698e77c5-fce6-47a5-b6e3-f4c56da226ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.076 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1aa81a-99cb-4fbe-a4b9-d2f7790d34a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.077 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a87b1642-1ac3-4b35-809d-79c74a2f4e13 in datapath 75073b6a-f711-4d82-9e11-07cd8a1d16e2 unbound from our chassis
Jan 22 17:20:23 np0005592767 systemd[1]: run-netns-ovnmeta\x2d698e77c5\x2dfce6\x2d47a5\x2db6e3\x2df4c56da226ea.mount: Deactivated successfully.
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.078 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75073b6a-f711-4d82-9e11-07cd8a1d16e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.078 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ad27a585-c684-45ee-88ef-7bbe3995abab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.079 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 namespace which is not needed anymore#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.096 182627 DEBUG nova.compute.manager [req-d5a8ea4b-5de3-4579-9bcc-a277a48457bd req-d5732396-a010-4969-ade5-2786de138c8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.097 182627 DEBUG oslo_concurrency.lockutils [req-d5a8ea4b-5de3-4579-9bcc-a277a48457bd req-d5732396-a010-4969-ade5-2786de138c8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.097 182627 DEBUG oslo_concurrency.lockutils [req-d5a8ea4b-5de3-4579-9bcc-a277a48457bd req-d5732396-a010-4969-ade5-2786de138c8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.098 182627 DEBUG oslo_concurrency.lockutils [req-d5a8ea4b-5de3-4579-9bcc-a277a48457bd req-d5732396-a010-4969-ade5-2786de138c8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.098 182627 DEBUG nova.compute.manager [req-d5a8ea4b-5de3-4579-9bcc-a277a48457bd req-d5732396-a010-4969-ade5-2786de138c8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.099 182627 DEBUG nova.compute.manager [req-d5a8ea4b-5de3-4579-9bcc-a277a48457bd req-d5732396-a010-4969-ade5-2786de138c8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-unplugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.106 182627 INFO nova.compute.manager [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.107 182627 DEBUG oslo.service.loopingcall [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.108 182627 DEBUG nova.compute.manager [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.108 182627 DEBUG nova.network.neutron [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:20:23 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [NOTICE]   (215493) : haproxy version is 2.8.14-c23fe91
Jan 22 17:20:23 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [NOTICE]   (215493) : path to executable is /usr/sbin/haproxy
Jan 22 17:20:23 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [WARNING]  (215493) : Exiting Master process...
Jan 22 17:20:23 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [ALERT]    (215493) : Current worker (215495) exited with code 143 (Terminated)
Jan 22 17:20:23 np0005592767 neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2[215489]: [WARNING]  (215493) : All workers exited. Exiting... (0)
Jan 22 17:20:23 np0005592767 systemd[1]: libpod-9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9.scope: Deactivated successfully.
Jan 22 17:20:23 np0005592767 podman[215621]: 2026-01-22 22:20:23.210083533 +0000 UTC m=+0.050161481 container died 9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:20:23 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9-userdata-shm.mount: Deactivated successfully.
Jan 22 17:20:23 np0005592767 systemd[1]: var-lib-containers-storage-overlay-c8a1445b690dd10760714c466bbc6c46cc311e4ced1cddf4037fcc8d9cf4124f-merged.mount: Deactivated successfully.
Jan 22 17:20:23 np0005592767 podman[215621]: 2026-01-22 22:20:23.244782797 +0000 UTC m=+0.084860755 container cleanup 9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:20:23 np0005592767 systemd[1]: libpod-conmon-9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9.scope: Deactivated successfully.
Jan 22 17:20:23 np0005592767 podman[215652]: 2026-01-22 22:20:23.307379217 +0000 UTC m=+0.043788657 container remove 9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.312 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1de441a0-955b-4acf-9164-7fde39fe5fad]: (4, ('Thu Jan 22 10:20:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 (9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9)\n9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9\nThu Jan 22 10:20:23 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 (9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9)\n9b235aa760fb1822971b5d7fcbf673ebf721f7ca3703e141051dff328bbd06e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.314 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[92623723-fffa-453f-901a-b841825e6fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.315 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75073b6a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.317 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:23 np0005592767 kernel: tap75073b6a-f0: left promiscuous mode
Jan 22 17:20:23 np0005592767 nova_compute[182623]: 2026-01-22 22:20:23.334 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.337 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c21cb5cf-82bf-4ece-8fcd-fc8b71104e48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.363 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5449e15e-2dd2-467e-bb32-be91fbd7d9e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.364 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e40714-a86c-46be-a5c4-7b0f6b7f6f0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.379 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bf258dca-a3e4-4ae1-ada5-a4d78a6aff3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401233, 'reachable_time': 15332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215666, 'error': None, 'target': 'ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.381 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-75073b6a-f711-4d82-9e11-07cd8a1d16e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:20:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:23.381 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c02b55-4e28-4853-b1b2-471c5f77518b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:20:23 np0005592767 systemd[1]: run-netns-ovnmeta\x2d75073b6a\x2df711\x2d4d82\x2d9e11\x2d07cd8a1d16e2.mount: Deactivated successfully.
Jan 22 17:20:24 np0005592767 nova_compute[182623]: 2026-01-22 22:20:24.843 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:24 np0005592767 nova_compute[182623]: 2026-01-22 22:20:24.853 182627 DEBUG nova.network.neutron [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:20:24 np0005592767 nova_compute[182623]: 2026-01-22 22:20:24.871 182627 INFO nova.compute.manager [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] Took 1.76 seconds to deallocate network for instance.#033[00m
Jan 22 17:20:24 np0005592767 nova_compute[182623]: 2026-01-22 22:20:24.984 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:24 np0005592767 nova_compute[182623]: 2026-01-22 22:20:24.984 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:24 np0005592767 nova_compute[182623]: 2026-01-22 22:20:24.989 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.039 182627 INFO nova.scheduler.client.report [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Deleted allocations for instance b484b5f7-0814-4161-b492-633788f2961f#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.108 182627 DEBUG oslo_concurrency.lockutils [None req-2f8fe731-7055-43e8-92d9-f5f539ff2817 06b4b3807dc64d83b8bfbbf0c4d31d77 9ead4241c55147dcbe136a6d6a69a60f - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.215 182627 DEBUG nova.compute.manager [req-8966657c-a656-41cf-8ad9-976f9ccb22a0 req-35b0ca82-a1e9-4d3d-852a-6df4a45d7d5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.216 182627 DEBUG oslo_concurrency.lockutils [req-8966657c-a656-41cf-8ad9-976f9ccb22a0 req-35b0ca82-a1e9-4d3d-852a-6df4a45d7d5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b484b5f7-0814-4161-b492-633788f2961f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.216 182627 DEBUG oslo_concurrency.lockutils [req-8966657c-a656-41cf-8ad9-976f9ccb22a0 req-35b0ca82-a1e9-4d3d-852a-6df4a45d7d5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.216 182627 DEBUG oslo_concurrency.lockutils [req-8966657c-a656-41cf-8ad9-976f9ccb22a0 req-35b0ca82-a1e9-4d3d-852a-6df4a45d7d5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b484b5f7-0814-4161-b492-633788f2961f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.216 182627 DEBUG nova.compute.manager [req-8966657c-a656-41cf-8ad9-976f9ccb22a0 req-35b0ca82-a1e9-4d3d-852a-6df4a45d7d5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] No waiting events found dispatching network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:20:25 np0005592767 nova_compute[182623]: 2026-01-22 22:20:25.217 182627 WARNING nova.compute.manager [req-8966657c-a656-41cf-8ad9-976f9ccb22a0 req-35b0ca82-a1e9-4d3d-852a-6df4a45d7d5b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b484b5f7-0814-4161-b492-633788f2961f] Received unexpected event network-vif-plugged-7e88b712-bef4-4434-b405-04af2a2d3d0f for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:20:27 np0005592767 podman[215668]: 2026-01-22 22:20:27.143230378 +0000 UTC m=+0.057657769 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public)
Jan 22 17:20:27 np0005592767 podman[215667]: 2026-01-22 22:20:27.166221552 +0000 UTC m=+0.082810825 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:20:28 np0005592767 nova_compute[182623]: 2026-01-22 22:20:28.011 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:29 np0005592767 nova_compute[182623]: 2026-01-22 22:20:29.896 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:32.125 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:20:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:32.126 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:20:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:20:32.127 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:20:32 np0005592767 nova_compute[182623]: 2026-01-22 22:20:32.176 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:33 np0005592767 nova_compute[182623]: 2026-01-22 22:20:33.013 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:33 np0005592767 podman[215712]: 2026-01-22 22:20:33.122840931 +0000 UTC m=+0.044052085 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:20:34 np0005592767 podman[215732]: 2026-01-22 22:20:34.161477105 +0000 UTC m=+0.082931509 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:20:34 np0005592767 nova_compute[182623]: 2026-01-22 22:20:34.946 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:37 np0005592767 nova_compute[182623]: 2026-01-22 22:20:37.987 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120422.9861712, b484b5f7-0814-4161-b492-633788f2961f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:37 np0005592767 nova_compute[182623]: 2026-01-22 22:20:37.987 182627 INFO nova.compute.manager [-] [instance: b484b5f7-0814-4161-b492-633788f2961f] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:20:38 np0005592767 nova_compute[182623]: 2026-01-22 22:20:38.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:38 np0005592767 nova_compute[182623]: 2026-01-22 22:20:38.027 182627 DEBUG nova.compute.manager [None req-84131f32-65ab-4116-aef9-eda34c4d9664 - - - - - -] [instance: b484b5f7-0814-4161-b492-633788f2961f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:39 np0005592767 nova_compute[182623]: 2026-01-22 22:20:39.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:42 np0005592767 podman[215757]: 2026-01-22 22:20:42.149128074 +0000 UTC m=+0.072454526 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:20:43 np0005592767 nova_compute[182623]: 2026-01-22 22:20:43.019 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:45 np0005592767 nova_compute[182623]: 2026-01-22 22:20:45.004 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.172 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "9f6d1141-b32d-484c-a80c-67f37fe7193c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.172 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "9f6d1141-b32d-484c-a80c-67f37fe7193c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.238 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.338 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.339 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.346 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.346 182627 INFO nova.compute.claims [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.438 182627 DEBUG nova.scheduler.client.report [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.466 182627 DEBUG nova.scheduler.client.report [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.467 182627 DEBUG nova.compute.provider_tree [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.482 182627 DEBUG nova.scheduler.client.report [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.514 182627 DEBUG nova.scheduler.client.report [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.564 182627 DEBUG nova.compute.provider_tree [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.585 182627 DEBUG nova.scheduler.client.report [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.609 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.610 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.664 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.665 182627 DEBUG nova.network.neutron [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.689 182627 INFO nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.711 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.864 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.866 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.867 182627 INFO nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Creating image(s)#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.868 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "/var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.869 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "/var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.870 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "/var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.896 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.962 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.964 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.964 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:47 np0005592767 nova_compute[182623]: 2026-01-22 22:20:47.975 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.004 182627 DEBUG nova.network.neutron [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.004 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.022 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.032 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.033 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.061 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.062 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.063 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.114 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.115 182627 DEBUG nova.virt.disk.api [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Checking if we can resize image /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.115 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.172 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.173 182627 DEBUG nova.virt.disk.api [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Cannot resize image /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.173 182627 DEBUG nova.objects.instance [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lazy-loading 'migration_context' on Instance uuid 9f6d1141-b32d-484c-a80c-67f37fe7193c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.190 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.191 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Ensure instance console log exists: /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.191 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.192 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.192 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.193 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.198 182627 WARNING nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.202 182627 DEBUG nova.virt.libvirt.host [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.202 182627 DEBUG nova.virt.libvirt.host [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.204 182627 DEBUG nova.virt.libvirt.host [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.205 182627 DEBUG nova.virt.libvirt.host [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.206 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.206 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.206 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.207 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.207 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.207 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.207 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.207 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.208 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.208 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.208 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.208 182627 DEBUG nova.virt.hardware [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.212 182627 DEBUG nova.objects.instance [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9f6d1141-b32d-484c-a80c-67f37fe7193c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.226 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <uuid>9f6d1141-b32d-484c-a80c-67f37fe7193c</uuid>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <name>instance-00000021</name>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:name>tempest-LiveMigrationNegativeTest-server-257163097</nova:name>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:20:48</nova:creationTime>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:user uuid="29ca219cab074bc3becb6860a04e3ecf">tempest-LiveMigrationNegativeTest-797901675-project-member</nova:user>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:        <nova:project uuid="2185a2fffe8b48239d8173f906b862a5">tempest-LiveMigrationNegativeTest-797901675</nova:project>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <entry name="serial">9f6d1141-b32d-484c-a80c-67f37fe7193c</entry>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <entry name="uuid">9f6d1141-b32d-484c-a80c-67f37fe7193c</entry>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.config"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/console.log" append="off"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:20:48 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:20:48 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:20:48 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:20:48 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.277 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.277 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.278 182627 INFO nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Using config drive#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.738 182627 INFO nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Creating config drive at /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.config#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.744 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpib76f_5e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:48 np0005592767 nova_compute[182623]: 2026-01-22 22:20:48.870 182627 DEBUG oslo_concurrency.processutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpib76f_5e" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:48 np0005592767 systemd-machined[153912]: New machine qemu-17-instance-00000021.
Jan 22 17:20:48 np0005592767 systemd[1]: Started Virtual Machine qemu-17-instance-00000021.
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.266 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120449.2660654, 9f6d1141-b32d-484c-a80c-67f37fe7193c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.266 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.269 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.269 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.273 182627 INFO nova.virt.libvirt.driver [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Instance spawned successfully.#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.273 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.290 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.295 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.300 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.301 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.302 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.302 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.302 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.303 182627 DEBUG nova.virt.libvirt.driver [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.333 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.334 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120449.266159, 9f6d1141-b32d-484c-a80c-67f37fe7193c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.334 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] VM Started (Lifecycle Event)#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.365 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.368 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.392 182627 INFO nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Took 1.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.393 182627 DEBUG nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.398 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.469 182627 INFO nova.compute.manager [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Took 2.17 seconds to build instance.#033[00m
Jan 22 17:20:49 np0005592767 nova_compute[182623]: 2026-01-22 22:20:49.485 182627 DEBUG oslo_concurrency.lockutils [None req-2a276367-ae1d-4815-9936-a582fb2be40b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "9f6d1141-b32d-484c-a80c-67f37fe7193c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:50 np0005592767 nova_compute[182623]: 2026-01-22 22:20:50.006 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:53 np0005592767 podman[215825]: 2026-01-22 22:20:53.137024452 +0000 UTC m=+0.050866392 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.709 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "f049836d-0d89-4320-a2ff-113aee8162c4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.709 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "f049836d-0d89-4320-a2ff-113aee8162c4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.730 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.853 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.853 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.858 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:20:53 np0005592767 nova_compute[182623]: 2026-01-22 22:20:53.859 182627 INFO nova.compute.claims [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.037 182627 DEBUG nova.compute.provider_tree [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.051 182627 DEBUG nova.scheduler.client.report [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.071 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.072 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.141 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.141 182627 DEBUG nova.network.neutron [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.156 182627 INFO nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.183 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.327 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.328 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.328 182627 INFO nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Creating image(s)#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.329 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "/var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.329 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "/var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.330 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "/var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.341 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.416 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.417 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.418 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.428 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.496 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.497 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.533 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.534 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.535 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.585 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.586 182627 DEBUG nova.virt.disk.api [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Checking if we can resize image /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.587 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.639 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.640 182627 DEBUG nova.virt.disk.api [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Cannot resize image /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.640 182627 DEBUG nova.objects.instance [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lazy-loading 'migration_context' on Instance uuid f049836d-0d89-4320-a2ff-113aee8162c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.652 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.652 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Ensure instance console log exists: /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.653 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.653 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.653 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.738 182627 DEBUG nova.network.neutron [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.739 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.741 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.744 182627 WARNING nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.750 182627 DEBUG nova.virt.libvirt.host [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.751 182627 DEBUG nova.virt.libvirt.host [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.762 182627 DEBUG nova.virt.libvirt.host [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.763 182627 DEBUG nova.virt.libvirt.host [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.764 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.764 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.765 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.765 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.765 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.765 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.765 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.766 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.766 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.766 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.766 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.766 182627 DEBUG nova.virt.hardware [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.770 182627 DEBUG nova.objects.instance [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid f049836d-0d89-4320-a2ff-113aee8162c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.784 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <uuid>f049836d-0d89-4320-a2ff-113aee8162c4</uuid>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <name>instance-00000024</name>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:name>tempest-LiveMigrationNegativeTest-server-143351499</nova:name>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:20:54</nova:creationTime>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:user uuid="29ca219cab074bc3becb6860a04e3ecf">tempest-LiveMigrationNegativeTest-797901675-project-member</nova:user>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:        <nova:project uuid="2185a2fffe8b48239d8173f906b862a5">tempest-LiveMigrationNegativeTest-797901675</nova:project>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <entry name="serial">f049836d-0d89-4320-a2ff-113aee8162c4</entry>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <entry name="uuid">f049836d-0d89-4320-a2ff-113aee8162c4</entry>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.config"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/console.log" append="off"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:20:54 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:20:54 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:20:54 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:20:54 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.828 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.828 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:20:54 np0005592767 nova_compute[182623]: 2026-01-22 22:20:54.829 182627 INFO nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Using config drive#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.007 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.025 182627 INFO nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Creating config drive at /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.config#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.030 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6d7pm05n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.154 182627 DEBUG oslo_concurrency.processutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6d7pm05n" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:20:55 np0005592767 systemd-machined[153912]: New machine qemu-18-instance-00000024.
Jan 22 17:20:55 np0005592767 systemd[1]: Started Virtual Machine qemu-18-instance-00000024.
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.623 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120455.6220741, f049836d-0d89-4320-a2ff-113aee8162c4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.624 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.626 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.627 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.630 182627 INFO nova.virt.libvirt.driver [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Instance spawned successfully.#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.631 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.654 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.657 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.665 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.665 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.666 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.666 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.666 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.667 182627 DEBUG nova.virt.libvirt.driver [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.690 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.691 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120455.6234574, f049836d-0d89-4320-a2ff-113aee8162c4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.691 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] VM Started (Lifecycle Event)#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.714 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.716 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.869 182627 INFO nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Took 1.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.870 182627 DEBUG nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.909 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.974 182627 INFO nova.compute.manager [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Took 2.16 seconds to build instance.#033[00m
Jan 22 17:20:55 np0005592767 nova_compute[182623]: 2026-01-22 22:20:55.989 182627 DEBUG oslo_concurrency.lockutils [None req-e9def0ef-d485-4543-93f8-6939c58d366b 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "f049836d-0d89-4320-a2ff-113aee8162c4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:20:57 np0005592767 nova_compute[182623]: 2026-01-22 22:20:57.266 182627 DEBUG nova.objects.instance [None req-fcae23dd-1c93-447f-b239-cab80bbc6fa1 67c8c09b403642b7a59cc621710d4860 e13c9f18dc6645d8b9d004afcf596a14 - - default default] Lazy-loading 'pci_devices' on Instance uuid f049836d-0d89-4320-a2ff-113aee8162c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:20:57 np0005592767 nova_compute[182623]: 2026-01-22 22:20:57.286 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120457.2864041, f049836d-0d89-4320-a2ff-113aee8162c4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:20:57 np0005592767 nova_compute[182623]: 2026-01-22 22:20:57.287 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:20:57 np0005592767 nova_compute[182623]: 2026-01-22 22:20:57.307 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:20:57 np0005592767 nova_compute[182623]: 2026-01-22 22:20:57.311 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:20:57 np0005592767 nova_compute[182623]: 2026-01-22 22:20:57.328 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 22 17:20:57 np0005592767 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 22 17:20:57 np0005592767 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000024.scope: Consumed 2.176s CPU time.
Jan 22 17:20:57 np0005592767 systemd-machined[153912]: Machine qemu-18-instance-00000024 terminated.
Jan 22 17:20:57 np0005592767 podman[215892]: 2026-01-22 22:20:57.983669872 +0000 UTC m=+0.088925882 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Jan 22 17:20:57 np0005592767 podman[215891]: 2026-01-22 22:20:57.983754985 +0000 UTC m=+0.090091576 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:20:58 np0005592767 nova_compute[182623]: 2026-01-22 22:20:58.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:20:58 np0005592767 nova_compute[182623]: 2026-01-22 22:20:58.070 182627 DEBUG nova.compute.manager [None req-fcae23dd-1c93-447f-b239-cab80bbc6fa1 67c8c09b403642b7a59cc621710d4860 e13c9f18dc6645d8b9d004afcf596a14 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.009 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.307 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "f049836d-0d89-4320-a2ff-113aee8162c4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.307 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "f049836d-0d89-4320-a2ff-113aee8162c4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.308 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "f049836d-0d89-4320-a2ff-113aee8162c4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.308 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "f049836d-0d89-4320-a2ff-113aee8162c4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.308 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "f049836d-0d89-4320-a2ff-113aee8162c4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.320 182627 INFO nova.compute.manager [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Terminating instance#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.341 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "refresh_cache-f049836d-0d89-4320-a2ff-113aee8162c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.342 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquired lock "refresh_cache-f049836d-0d89-4320-a2ff-113aee8162c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.342 182627 DEBUG nova.network.neutron [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:21:00 np0005592767 nova_compute[182623]: 2026-01-22 22:21:00.647 182627 DEBUG nova.network.neutron [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:21:01 np0005592767 nova_compute[182623]: 2026-01-22 22:21:01.822 182627 DEBUG nova.network.neutron [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.745 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Releasing lock "refresh_cache-f049836d-0d89-4320-a2ff-113aee8162c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.746 182627 DEBUG nova.compute.manager [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.752 182627 INFO nova.virt.libvirt.driver [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Instance destroyed successfully.#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.752 182627 DEBUG nova.objects.instance [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lazy-loading 'resources' on Instance uuid f049836d-0d89-4320-a2ff-113aee8162c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.766 182627 INFO nova.virt.libvirt.driver [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Deleting instance files /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4_del#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.767 182627 INFO nova.virt.libvirt.driver [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Deletion of /var/lib/nova/instances/f049836d-0d89-4320-a2ff-113aee8162c4_del complete#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.917 182627 INFO nova.compute.manager [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Took 0.17 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.918 182627 DEBUG oslo.service.loopingcall [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.918 182627 DEBUG nova.compute.manager [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:21:02 np0005592767 nova_compute[182623]: 2026-01-22 22:21:02.919 182627 DEBUG nova.network.neutron [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.118 182627 DEBUG nova.network.neutron [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.141 182627 DEBUG nova.network.neutron [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.160 182627 INFO nova.compute.manager [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Took 0.24 seconds to deallocate network for instance.#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.268 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.268 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.342 182627 DEBUG nova.compute.provider_tree [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.354 182627 DEBUG nova.scheduler.client.report [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.380 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.414 182627 INFO nova.scheduler.client.report [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Deleted allocations for instance f049836d-0d89-4320-a2ff-113aee8162c4#033[00m
Jan 22 17:21:03 np0005592767 nova_compute[182623]: 2026-01-22 22:21:03.485 182627 DEBUG oslo_concurrency.lockutils [None req-0aab0751-44a2-4c8b-aac4-73b4a86fca05 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "f049836d-0d89-4320-a2ff-113aee8162c4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:04 np0005592767 podman[215965]: 2026-01-22 22:21:04.144241203 +0000 UTC m=+0.058787409 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.158 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "9f6d1141-b32d-484c-a80c-67f37fe7193c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.159 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "9f6d1141-b32d-484c-a80c-67f37fe7193c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.159 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "9f6d1141-b32d-484c-a80c-67f37fe7193c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.159 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "9f6d1141-b32d-484c-a80c-67f37fe7193c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.159 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "9f6d1141-b32d-484c-a80c-67f37fe7193c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.170 182627 INFO nova.compute.manager [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Terminating instance#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.184 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "refresh_cache-9f6d1141-b32d-484c-a80c-67f37fe7193c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.185 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquired lock "refresh_cache-9f6d1141-b32d-484c-a80c-67f37fe7193c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.185 182627 DEBUG nova.network.neutron [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.403 182627 DEBUG nova.network.neutron [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.902 182627 DEBUG nova.network.neutron [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.926 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Releasing lock "refresh_cache-9f6d1141-b32d-484c-a80c-67f37fe7193c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:21:04 np0005592767 nova_compute[182623]: 2026-01-22 22:21:04.927 182627 DEBUG nova.compute.manager [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:21:04 np0005592767 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 22 17:21:04 np0005592767 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000021.scope: Consumed 11.747s CPU time.
Jan 22 17:21:04 np0005592767 systemd-machined[153912]: Machine qemu-17-instance-00000021 terminated.
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.011 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:05 np0005592767 podman[215984]: 2026-01-22 22:21:05.02600445 +0000 UTC m=+0.060805487 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.175 182627 INFO nova.virt.libvirt.driver [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Instance destroyed successfully.#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.176 182627 DEBUG nova.objects.instance [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lazy-loading 'resources' on Instance uuid 9f6d1141-b32d-484c-a80c-67f37fe7193c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.187 182627 INFO nova.virt.libvirt.driver [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Deleting instance files /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c_del#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.188 182627 INFO nova.virt.libvirt.driver [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Deletion of /var/lib/nova/instances/9f6d1141-b32d-484c-a80c-67f37fe7193c_del complete#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.260 182627 INFO nova.compute.manager [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.261 182627 DEBUG oslo.service.loopingcall [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.261 182627 DEBUG nova.compute.manager [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.261 182627 DEBUG nova.network.neutron [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.508 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.539 182627 DEBUG nova.network.neutron [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.551 182627 DEBUG nova.network.neutron [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.563 182627 INFO nova.compute.manager [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Took 0.30 seconds to deallocate network for instance.#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.657 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.657 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.853 182627 DEBUG nova.compute.provider_tree [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.874 182627 DEBUG nova.scheduler.client.report [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:21:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:21:05Z|00142|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.897 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:05 np0005592767 nova_compute[182623]: 2026-01-22 22:21:05.924 182627 INFO nova.scheduler.client.report [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Deleted allocations for instance 9f6d1141-b32d-484c-a80c-67f37fe7193c#033[00m
Jan 22 17:21:06 np0005592767 nova_compute[182623]: 2026-01-22 22:21:06.041 182627 DEBUG oslo_concurrency.lockutils [None req-23b53982-5ca8-4035-b0a7-8dc9a2787ed4 29ca219cab074bc3becb6860a04e3ecf 2185a2fffe8b48239d8173f906b862a5 - - default default] Lock "9f6d1141-b32d-484c-a80c-67f37fe7193c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:07 np0005592767 nova_compute[182623]: 2026-01-22 22:21:07.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:08 np0005592767 nova_compute[182623]: 2026-01-22 22:21:08.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:08 np0005592767 nova_compute[182623]: 2026-01-22 22:21:08.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:08 np0005592767 nova_compute[182623]: 2026-01-22 22:21:08.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:08 np0005592767 nova_compute[182623]: 2026-01-22 22:21:08.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:09 np0005592767 nova_compute[182623]: 2026-01-22 22:21:09.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:09 np0005592767 nova_compute[182623]: 2026-01-22 22:21:09.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:21:09 np0005592767 nova_compute[182623]: 2026-01-22 22:21:09.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:21:10 np0005592767 nova_compute[182623]: 2026-01-22 22:21:10.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:10 np0005592767 nova_compute[182623]: 2026-01-22 22:21:10.519 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:21:10 np0005592767 nova_compute[182623]: 2026-01-22 22:21:10.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:10 np0005592767 nova_compute[182623]: 2026-01-22 22:21:10.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:10 np0005592767 nova_compute[182623]: 2026-01-22 22:21:10.911 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:10 np0005592767 nova_compute[182623]: 2026-01-22 22:21:10.912 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:21:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:21:12.093 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:21:12.094 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:21:12.094 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:12 np0005592767 nova_compute[182623]: 2026-01-22 22:21:12.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:21:12 np0005592767 nova_compute[182623]: 2026-01-22 22:21:12.922 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:12 np0005592767 nova_compute[182623]: 2026-01-22 22:21:12.922 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:12 np0005592767 nova_compute[182623]: 2026-01-22 22:21:12.922 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:12 np0005592767 nova_compute[182623]: 2026-01-22 22:21:12.922 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:21:13 np0005592767 podman[216018]: 2026-01-22 22:21:13.02474573 +0000 UTC m=+0.065354466 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.042 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.072 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120458.0703995, f049836d-0d89-4320-a2ff-113aee8162c4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.072 182627 INFO nova.compute.manager [-] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.078 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.078 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5717MB free_disk=73.27470779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.079 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.079 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.096 182627 DEBUG nova.compute.manager [None req-c0d4ba3c-3aa6-48fa-a80c-56837e6ca038 - - - - - -] [instance: f049836d-0d89-4320-a2ff-113aee8162c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.143 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.144 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.168 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.184 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.207 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:21:13 np0005592767 nova_compute[182623]: 2026-01-22 22:21:13.208 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:15 np0005592767 nova_compute[182623]: 2026-01-22 22:21:15.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.069 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.070 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.092 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.216 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.217 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.224 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.224 182627 INFO nova.compute.claims [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.380 182627 DEBUG nova.compute.provider_tree [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.402 182627 DEBUG nova.scheduler.client.report [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.432 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.433 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.515 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.516 182627 DEBUG nova.network.neutron [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.542 182627 INFO nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.563 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.708 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.710 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.711 182627 INFO nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Creating image(s)#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.712 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.712 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.714 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.745 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.802 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.803 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.804 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.818 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.873 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.875 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.901 182627 DEBUG nova.network.neutron [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.902 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.923 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.924 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:16 np0005592767 nova_compute[182623]: 2026-01-22 22:21:16.925 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.013 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.015 182627 DEBUG nova.virt.disk.api [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Checking if we can resize image /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.016 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.107 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.109 182627 DEBUG nova.virt.disk.api [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Cannot resize image /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.109 182627 DEBUG nova.objects.instance [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.160 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.161 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Ensure instance console log exists: /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.161 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.162 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.162 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.165 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.170 182627 WARNING nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.175 182627 DEBUG nova.virt.libvirt.host [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.176 182627 DEBUG nova.virt.libvirt.host [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.180 182627 DEBUG nova.virt.libvirt.host [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.181 182627 DEBUG nova.virt.libvirt.host [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.184 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.185 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.186 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.187 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.188 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.188 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.189 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.189 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.190 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.191 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.192 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.192 182627 DEBUG nova.virt.hardware [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.202 182627 DEBUG nova.objects.instance [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.224 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <uuid>79166459-7b8b-44ed-8dba-0ba4cb9d97ff</uuid>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <name>instance-00000025</name>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:name>tempest-MigrationsAdminTest-server-1393496925</nova:name>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:21:17</nova:creationTime>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:        <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <entry name="serial">79166459-7b8b-44ed-8dba-0ba4cb9d97ff</entry>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <entry name="uuid">79166459-7b8b-44ed-8dba-0ba4cb9d97ff</entry>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/console.log" append="off"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:21:17 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:21:17 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:21:17 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:21:17 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.264 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.265 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.265 182627 INFO nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Using config drive#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.471 182627 INFO nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Creating config drive at /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.482 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfv_weiw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:21:17 np0005592767 nova_compute[182623]: 2026-01-22 22:21:17.625 182627 DEBUG oslo_concurrency.processutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfv_weiw" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:21:17 np0005592767 systemd-machined[153912]: New machine qemu-19-instance-00000025.
Jan 22 17:21:17 np0005592767 systemd[1]: Started Virtual Machine qemu-19-instance-00000025.
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.044 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.336 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120478.3360372, 79166459-7b8b-44ed-8dba-0ba4cb9d97ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.339 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.343 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.344 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.348 182627 INFO nova.virt.libvirt.driver [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance spawned successfully.#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.348 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.366 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.371 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.375 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.375 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.376 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.376 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.376 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.377 182627 DEBUG nova.virt.libvirt.driver [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.399 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.400 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120478.3421314, 79166459-7b8b-44ed-8dba-0ba4cb9d97ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.400 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] VM Started (Lifecycle Event)#033[00m
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.432 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.435 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.453 182627 INFO nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Took 1.75 seconds to spawn the instance on the hypervisor.
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.454 182627 DEBUG nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.459 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.523 182627 INFO nova.compute.manager [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Took 2.35 seconds to build instance.
Jan 22 17:21:18 np0005592767 nova_compute[182623]: 2026-01-22 22:21:18.540 182627 DEBUG oslo_concurrency.lockutils [None req-42a8ad9a-1af7-4b7e-ab07-2c7e6b64bbd1 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:20 np0005592767 nova_compute[182623]: 2026-01-22 22:21:20.032 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:20 np0005592767 nova_compute[182623]: 2026-01-22 22:21:20.174 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120465.1732585, 9f6d1141-b32d-484c-a80c-67f37fe7193c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:21:20 np0005592767 nova_compute[182623]: 2026-01-22 22:21:20.174 182627 INFO nova.compute.manager [-] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] VM Stopped (Lifecycle Event)
Jan 22 17:21:20 np0005592767 nova_compute[182623]: 2026-01-22 22:21:20.194 182627 DEBUG nova.compute.manager [None req-44e1112e-007b-4824-838f-0b9ce1bbc552 - - - - - -] [instance: 9f6d1141-b32d-484c-a80c-67f37fe7193c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:21:22 np0005592767 nova_compute[182623]: 2026-01-22 22:21:22.983 182627 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:21:22 np0005592767 nova_compute[182623]: 2026-01-22 22:21:22.984 182627 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquired lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:21:22 np0005592767 nova_compute[182623]: 2026-01-22 22:21:22.985 182627 DEBUG nova.network.neutron [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:21:23 np0005592767 nova_compute[182623]: 2026-01-22 22:21:23.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:23 np0005592767 nova_compute[182623]: 2026-01-22 22:21:23.174 182627 DEBUG nova.network.neutron [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:21:23 np0005592767 nova_compute[182623]: 2026-01-22 22:21:23.979 182627 DEBUG nova.network.neutron [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:21:23 np0005592767 nova_compute[182623]: 2026-01-22 22:21:23.996 182627 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Releasing lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.115 182627 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.117 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Creating file /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/44635b57fe244a69ab80ea6dc22f329f.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.117 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/44635b57fe244a69ab80ea6dc22f329f.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:24 np0005592767 podman[216086]: 2026-01-22 22:21:24.186512414 +0000 UTC m=+0.095566254 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.612 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/44635b57fe244a69ab80ea6dc22f329f.tmp" returned: 1 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.613 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/44635b57fe244a69ab80ea6dc22f329f.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.613 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Creating directory /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.614 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.846 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff" returned: 0 in 0.232s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:24 np0005592767 nova_compute[182623]: 2026-01-22 22:21:24.850 182627 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 17:21:25 np0005592767 nova_compute[182623]: 2026-01-22 22:21:25.035 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:28 np0005592767 nova_compute[182623]: 2026-01-22 22:21:28.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:28 np0005592767 podman[216109]: 2026-01-22 22:21:28.153954428 +0000 UTC m=+0.065936302 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal)
Jan 22 17:21:28 np0005592767 podman[216108]: 2026-01-22 22:21:28.192742009 +0000 UTC m=+0.108942033 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:21:30 np0005592767 nova_compute[182623]: 2026-01-22 22:21:30.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:33 np0005592767 nova_compute[182623]: 2026-01-22 22:21:33.054 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:21:34.217 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:21:34 np0005592767 nova_compute[182623]: 2026-01-22 22:21:34.217 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:21:34.219 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:21:34 np0005592767 nova_compute[182623]: 2026-01-22 22:21:34.895 182627 DEBUG nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 17:21:35 np0005592767 nova_compute[182623]: 2026-01-22 22:21:35.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:35 np0005592767 podman[216168]: 2026-01-22 22:21:35.163292659 +0000 UTC m=+0.071137690 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:21:35 np0005592767 podman[216167]: 2026-01-22 22:21:35.170417041 +0000 UTC m=+0.073523968 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:21:37 np0005592767 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 22 17:21:37 np0005592767 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000025.scope: Consumed 12.100s CPU time.
Jan 22 17:21:37 np0005592767 systemd-machined[153912]: Machine qemu-19-instance-00000025 terminated.
Jan 22 17:21:37 np0005592767 nova_compute[182623]: 2026-01-22 22:21:37.909 182627 INFO nova.virt.libvirt.driver [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance shutdown successfully after 13 seconds.
Jan 22 17:21:37 np0005592767 nova_compute[182623]: 2026-01-22 22:21:37.916 182627 INFO nova.virt.libvirt.driver [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance destroyed successfully.
Jan 22 17:21:37 np0005592767 nova_compute[182623]: 2026-01-22 22:21:37.923 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:37 np0005592767 nova_compute[182623]: 2026-01-22 22:21:37.986 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:37 np0005592767 nova_compute[182623]: 2026-01-22 22:21:37.988 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:38 np0005592767 nova_compute[182623]: 2026-01-22 22:21:38.044 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:38 np0005592767 nova_compute[182623]: 2026-01-22 22:21:38.047 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Copying file /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk to 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 17:21:38 np0005592767 nova_compute[182623]: 2026-01-22 22:21:38.048 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:38 np0005592767 nova_compute[182623]: 2026-01-22 22:21:38.068 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.014 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "scp -r /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk" returned: 0 in 0.966s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.015 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Copying file /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.016 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk.config 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.315 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "scp -C -r /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk.config 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.config" returned: 0 in 0.300s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.317 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Copying file /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.318 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk.info 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.561 182627 DEBUG oslo_concurrency.processutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "scp -C -r /var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff_resize/disk.info 192.168.122.100:/var/lib/nova/instances/79166459-7b8b-44ed-8dba-0ba4cb9d97ff/disk.info" returned: 0 in 0.243s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.725 182627 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.726 182627 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:21:39 np0005592767 nova_compute[182623]: 2026-01-22 22:21:39.726 182627 DEBUG oslo_concurrency.lockutils [None req-30026b50-2cea-4fbb-a7cc-d8a3b1ad8533 f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:21:40 np0005592767 nova_compute[182623]: 2026-01-22 22:21:40.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:21:40.221 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:21:43 np0005592767 nova_compute[182623]: 2026-01-22 22:21:43.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:21:43 np0005592767 podman[216230]: 2026-01-22 22:21:43.167134454 +0000 UTC m=+0.063590196 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.022 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.022 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.022 182627 DEBUG nova.compute.manager [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Going to confirm migration 7 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.075 182627 DEBUG nova.objects.instance [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'info_cache' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.858 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.859 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:21:44 np0005592767 nova_compute[182623]: 2026-01-22 22:21:44.860 182627 DEBUG nova.network.neutron [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.042 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.302 182627 DEBUG nova.network.neutron [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.572 182627 DEBUG nova.network.neutron [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.584 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-79166459-7b8b-44ed-8dba-0ba4cb9d97ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.584 182627 DEBUG nova.objects.instance [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid 79166459-7b8b-44ed-8dba-0ba4cb9d97ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.610 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.610 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.676 182627 DEBUG nova.compute.provider_tree [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.689 182627 DEBUG nova.scheduler.client.report [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.730 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:45 np0005592767 nova_compute[182623]: 2026-01-22 22:21:45.910 182627 INFO nova.scheduler.client.report [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Deleted allocation for migration 4ec7d364-40db-44a4-8d88-b751eac5fa23#033[00m
Jan 22 17:21:46 np0005592767 nova_compute[182623]: 2026-01-22 22:21:46.022 182627 DEBUG oslo_concurrency.lockutils [None req-5ea98d9c-32b2-4a2b-9050-7108671e4620 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "79166459-7b8b-44ed-8dba-0ba4cb9d97ff" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:48 np0005592767 nova_compute[182623]: 2026-01-22 22:21:48.076 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:50 np0005592767 nova_compute[182623]: 2026-01-22 22:21:50.045 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:52 np0005592767 nova_compute[182623]: 2026-01-22 22:21:52.331 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120497.3304183, 79166459-7b8b-44ed-8dba-0ba4cb9d97ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:21:52 np0005592767 nova_compute[182623]: 2026-01-22 22:21:52.332 182627 INFO nova.compute.manager [-] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:21:52 np0005592767 nova_compute[182623]: 2026-01-22 22:21:52.353 182627 DEBUG nova.compute.manager [None req-7483a5ac-fde5-4980-85f7-0434e3f01bc9 - - - - - -] [instance: 79166459-7b8b-44ed-8dba-0ba4cb9d97ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:21:53 np0005592767 nova_compute[182623]: 2026-01-22 22:21:53.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.688 182627 DEBUG nova.compute.manager [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.814 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.815 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.834 182627 DEBUG nova.objects.instance [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_requests' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.852 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.852 182627 INFO nova.compute.claims [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.853 182627 DEBUG nova.objects.instance [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.866 182627 DEBUG nova.objects.instance [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_devices' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.930 182627 INFO nova.compute.resource_tracker [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating resource usage from migration 90e6c5da-445b-4739-932f-03c7172db2c7#033[00m
Jan 22 17:21:54 np0005592767 nova_compute[182623]: 2026-01-22 22:21:54.930 182627 DEBUG nova.compute.resource_tracker [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Starting to track incoming migration 90e6c5da-445b-4739-932f-03c7172db2c7 with flavor 617fb2f8-2c15-4939-a64a-90fca4acd12a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:21:55 np0005592767 nova_compute[182623]: 2026-01-22 22:21:55.023 182627 DEBUG nova.compute.provider_tree [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:21:55 np0005592767 nova_compute[182623]: 2026-01-22 22:21:55.037 182627 DEBUG nova.scheduler.client.report [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:21:55 np0005592767 nova_compute[182623]: 2026-01-22 22:21:55.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:55 np0005592767 nova_compute[182623]: 2026-01-22 22:21:55.052 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:21:55 np0005592767 nova_compute[182623]: 2026-01-22 22:21:55.052 182627 INFO nova.compute.manager [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Migrating#033[00m
Jan 22 17:21:55 np0005592767 podman[216257]: 2026-01-22 22:21:55.163564028 +0000 UTC m=+0.072848999 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 17:21:56 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:21:56 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:21:56 np0005592767 systemd-logind[802]: New session 31 of user nova.
Jan 22 17:21:56 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:21:56 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:21:56 np0005592767 systemd[216282]: Queued start job for default target Main User Target.
Jan 22 17:21:56 np0005592767 systemd[216282]: Created slice User Application Slice.
Jan 22 17:21:56 np0005592767 systemd[216282]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:21:56 np0005592767 systemd[216282]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:21:56 np0005592767 systemd[216282]: Reached target Paths.
Jan 22 17:21:56 np0005592767 systemd[216282]: Reached target Timers.
Jan 22 17:21:56 np0005592767 systemd[216282]: Starting D-Bus User Message Bus Socket...
Jan 22 17:21:56 np0005592767 systemd[216282]: Starting Create User's Volatile Files and Directories...
Jan 22 17:21:56 np0005592767 systemd[216282]: Finished Create User's Volatile Files and Directories.
Jan 22 17:21:56 np0005592767 systemd[216282]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:21:56 np0005592767 systemd[216282]: Reached target Sockets.
Jan 22 17:21:56 np0005592767 systemd[216282]: Reached target Basic System.
Jan 22 17:21:56 np0005592767 systemd[216282]: Reached target Main User Target.
Jan 22 17:21:56 np0005592767 systemd[216282]: Startup finished in 147ms.
Jan 22 17:21:56 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:21:56 np0005592767 systemd[1]: Started Session 31 of User nova.
Jan 22 17:21:56 np0005592767 systemd[1]: session-31.scope: Deactivated successfully.
Jan 22 17:21:56 np0005592767 systemd-logind[802]: Session 31 logged out. Waiting for processes to exit.
Jan 22 17:21:56 np0005592767 systemd-logind[802]: Removed session 31.
Jan 22 17:21:56 np0005592767 systemd-logind[802]: New session 33 of user nova.
Jan 22 17:21:56 np0005592767 systemd[1]: Started Session 33 of User nova.
Jan 22 17:21:56 np0005592767 systemd[1]: session-33.scope: Deactivated successfully.
Jan 22 17:21:56 np0005592767 systemd-logind[802]: Session 33 logged out. Waiting for processes to exit.
Jan 22 17:21:56 np0005592767 systemd-logind[802]: Removed session 33.
Jan 22 17:21:58 np0005592767 nova_compute[182623]: 2026-01-22 22:21:58.117 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:21:59 np0005592767 podman[216307]: 2026-01-22 22:21:59.181813223 +0000 UTC m=+0.079773845 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350)
Jan 22 17:21:59 np0005592767 podman[216306]: 2026-01-22 22:21:59.209993683 +0000 UTC m=+0.108493610 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:22:00 np0005592767 nova_compute[182623]: 2026-01-22 22:22:00.048 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:03 np0005592767 nova_compute[182623]: 2026-01-22 22:22:03.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:05 np0005592767 nova_compute[182623]: 2026-01-22 22:22:05.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:06 np0005592767 podman[216355]: 2026-01-22 22:22:06.187652534 +0000 UTC m=+0.088878013 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:22:06 np0005592767 nova_compute[182623]: 2026-01-22 22:22:06.207 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:06 np0005592767 podman[216354]: 2026-01-22 22:22:06.210278227 +0000 UTC m=+0.117389763 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:22:06 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:22:06 np0005592767 systemd[216282]: Activating special unit Exit the Session...
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped target Main User Target.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped target Basic System.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped target Paths.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped target Sockets.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped target Timers.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:06 np0005592767 systemd[216282]: Closed D-Bus User Message Bus Socket.
Jan 22 17:22:06 np0005592767 systemd[216282]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:22:06 np0005592767 systemd[216282]: Removed slice User Application Slice.
Jan 22 17:22:06 np0005592767 systemd[216282]: Reached target Shutdown.
Jan 22 17:22:06 np0005592767 systemd[216282]: Finished Exit the Session.
Jan 22 17:22:06 np0005592767 systemd[216282]: Reached target Exit the Session.
Jan 22 17:22:06 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:22:06 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:22:06 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:22:06 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:22:06 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:22:06 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:22:06 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.288 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.288 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.303 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:22:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.402 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.403 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.413 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.414 182627 INFO nova.compute.claims [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.552 182627 DEBUG nova.compute.provider_tree [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.571 182627 DEBUG nova.scheduler.client.report [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.591 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.591 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.650 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.651 182627 DEBUG nova.network.neutron [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.674 182627 INFO nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.692 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.808 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.809 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.809 182627 INFO nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Creating image(s)
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.810 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "/var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.810 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.811 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.822 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.864 182627 DEBUG nova.policy [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d9fe7f0e8b4edf92fa2064aaab8bca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3a2ee662fba426c8f688455b20759bf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.877 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.878 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.878 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.889 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.938 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.939 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.973 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.974 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:07 np0005592767 nova_compute[182623]: 2026-01-22 22:22:07.974 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.029 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.030 182627 DEBUG nova.virt.disk.api [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Checking if we can resize image /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.030 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.081 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.082 182627 DEBUG nova.virt.disk.api [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Cannot resize image /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.082 182627 DEBUG nova.objects.instance [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'migration_context' on Instance uuid 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.094 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.095 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Ensure instance console log exists: /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.095 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.095 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.096 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.126 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.684 182627 DEBUG nova.network.neutron [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Successfully created port: d48f9605-219d-4393-b6ef-7d9c783e0a18 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:22:08 np0005592767 nova_compute[182623]: 2026-01-22 22:22:08.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:09 np0005592767 nova_compute[182623]: 2026-01-22 22:22:09.800 182627 DEBUG nova.network.neutron [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Successfully updated port: d48f9605-219d-4393-b6ef-7d9c783e0a18 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:22:09 np0005592767 nova_compute[182623]: 2026-01-22 22:22:09.841 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "refresh_cache-3f7c3d83-9a90-4ca3-8206-22e36eef2e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:22:09 np0005592767 nova_compute[182623]: 2026-01-22 22:22:09.841 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquired lock "refresh_cache-3f7c3d83-9a90-4ca3-8206-22e36eef2e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:22:09 np0005592767 nova_compute[182623]: 2026-01-22 22:22:09.841 182627 DEBUG nova.network.neutron [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:22:09 np0005592767 nova_compute[182623]: 2026-01-22 22:22:09.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.010 182627 DEBUG nova.compute.manager [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-changed-d48f9605-219d-4393-b6ef-7d9c783e0a18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.011 182627 DEBUG nova.compute.manager [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Refreshing instance network info cache due to event network-changed-d48f9605-219d-4393-b6ef-7d9c783e0a18. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.011 182627 DEBUG oslo_concurrency.lockutils [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-3f7c3d83-9a90-4ca3-8206-22e36eef2e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:22:10 np0005592767 systemd-logind[802]: New session 34 of user nova.
Jan 22 17:22:10 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:22:10 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.053 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.069 182627 DEBUG nova.network.neutron [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:22:10 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:22:10 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:22:10 np0005592767 systemd[216417]: Queued start job for default target Main User Target.
Jan 22 17:22:10 np0005592767 systemd[216417]: Created slice User Application Slice.
Jan 22 17:22:10 np0005592767 systemd[216417]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:10 np0005592767 systemd[216417]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:10 np0005592767 systemd[216417]: Reached target Paths.
Jan 22 17:22:10 np0005592767 systemd[216417]: Reached target Timers.
Jan 22 17:22:10 np0005592767 systemd[216417]: Starting D-Bus User Message Bus Socket...
Jan 22 17:22:10 np0005592767 systemd[216417]: Starting Create User's Volatile Files and Directories...
Jan 22 17:22:10 np0005592767 systemd[216417]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:22:10 np0005592767 systemd[216417]: Reached target Sockets.
Jan 22 17:22:10 np0005592767 systemd[216417]: Finished Create User's Volatile Files and Directories.
Jan 22 17:22:10 np0005592767 systemd[216417]: Reached target Basic System.
Jan 22 17:22:10 np0005592767 systemd[216417]: Reached target Main User Target.
Jan 22 17:22:10 np0005592767 systemd[216417]: Startup finished in 150ms.
Jan 22 17:22:10 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:22:10 np0005592767 systemd[1]: Started Session 34 of User nova.
Jan 22 17:22:10 np0005592767 systemd[1]: session-34.scope: Deactivated successfully.
Jan 22 17:22:10 np0005592767 systemd-logind[802]: Session 34 logged out. Waiting for processes to exit.
Jan 22 17:22:10 np0005592767 systemd-logind[802]: Removed session 34.
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.914 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.914 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:10 np0005592767 nova_compute[182623]: 2026-01-22 22:22:10.915 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:22:10 np0005592767 systemd-logind[802]: New session 36 of user nova.
Jan 22 17:22:10 np0005592767 systemd[1]: Started Session 36 of User nova.
Jan 22 17:22:11 np0005592767 systemd[1]: session-36.scope: Deactivated successfully.
Jan 22 17:22:11 np0005592767 systemd-logind[802]: Session 36 logged out. Waiting for processes to exit.
Jan 22 17:22:11 np0005592767 systemd-logind[802]: Removed session 36.
Jan 22 17:22:11 np0005592767 systemd-logind[802]: New session 37 of user nova.
Jan 22 17:22:11 np0005592767 systemd[1]: Started Session 37 of User nova.
Jan 22 17:22:11 np0005592767 systemd[1]: session-37.scope: Deactivated successfully.
Jan 22 17:22:11 np0005592767 systemd-logind[802]: Session 37 logged out. Waiting for processes to exit.
Jan 22 17:22:11 np0005592767 systemd-logind[802]: Removed session 37.
Jan 22 17:22:11 np0005592767 nova_compute[182623]: 2026-01-22 22:22:11.837 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:22:11 np0005592767 nova_compute[182623]: 2026-01-22 22:22:11.837 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:22:11 np0005592767 nova_compute[182623]: 2026-01-22 22:22:11.838 182627 DEBUG nova.network.neutron [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:22:11 np0005592767 nova_compute[182623]: 2026-01-22 22:22:11.990 182627 DEBUG nova.network.neutron [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.063 182627 DEBUG nova.network.neutron [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Updating instance_info_cache with network_info: [{"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.079 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Releasing lock "refresh_cache-3f7c3d83-9a90-4ca3-8206-22e36eef2e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.079 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Instance network_info: |[{"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.080 182627 DEBUG oslo_concurrency.lockutils [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-3f7c3d83-9a90-4ca3-8206-22e36eef2e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.080 182627 DEBUG nova.network.neutron [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Refreshing network info cache for port d48f9605-219d-4393-b6ef-7d9c783e0a18 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.085 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Start _get_guest_xml network_info=[{"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.092 182627 WARNING nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:22:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:12.093 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:12.094 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:12.094 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.097 182627 DEBUG nova.virt.libvirt.host [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.098 182627 DEBUG nova.virt.libvirt.host [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.102 182627 DEBUG nova.virt.libvirt.host [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.103 182627 DEBUG nova.virt.libvirt.host [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.105 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.105 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.106 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.106 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.107 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.107 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.107 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.108 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.108 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.109 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.109 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.110 182627 DEBUG nova.virt.hardware [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.116 182627 DEBUG nova.virt.libvirt.vif [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:22:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2092297008',display_name='tempest-ImagesTestJSON-server-2092297008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2092297008',id=40,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-ksigyrm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:22:07Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=3f7c3d83-9a90-4ca3-8206-22e36eef2e04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.116 182627 DEBUG nova.network.os_vif_util [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.118 182627 DEBUG nova.network.os_vif_util [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.119 182627 DEBUG nova.objects.instance [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.138 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <uuid>3f7c3d83-9a90-4ca3-8206-22e36eef2e04</uuid>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <name>instance-00000028</name>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:name>tempest-ImagesTestJSON-server-2092297008</nova:name>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:22:12</nova:creationTime>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:user uuid="52d9fe7f0e8b4edf92fa2064aaab8bca">tempest-ImagesTestJSON-23148374-project-member</nova:user>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:project uuid="d3a2ee662fba426c8f688455b20759bf">tempest-ImagesTestJSON-23148374</nova:project>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:port uuid="d48f9605-219d-4393-b6ef-7d9c783e0a18">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="serial">3f7c3d83-9a90-4ca3-8206-22e36eef2e04</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="uuid">3f7c3d83-9a90-4ca3-8206-22e36eef2e04</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.config"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:1b:ac:f3"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <target dev="tapd48f9605-21"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/console.log" append="off"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:22:12 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:22:12 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.138 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Preparing to wait for external event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.139 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.140 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.140 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.141 182627 DEBUG nova.virt.libvirt.vif [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:22:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2092297008',display_name='tempest-ImagesTestJSON-server-2092297008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2092297008',id=40,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-ksigyrm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:22:07Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=3f7c3d83-9a90-4ca3-8206-22e36eef2e04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.142 182627 DEBUG nova.network.os_vif_util [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.143 182627 DEBUG nova.network.os_vif_util [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.144 182627 DEBUG os_vif [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.145 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.146 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.147 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.151 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd48f9605-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.151 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd48f9605-21, col_values=(('external_ids', {'iface-id': 'd48f9605-219d-4393-b6ef-7d9c783e0a18', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:ac:f3', 'vm-uuid': '3f7c3d83-9a90-4ca3-8206-22e36eef2e04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:12 np0005592767 NetworkManager[54973]: <info>  [1769120532.1531] manager: (tapd48f9605-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.163 182627 INFO os_vif [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21')#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.231 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.231 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.232 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No VIF found with MAC fa:16:3e:1b:ac:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.233 182627 INFO nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Using config drive#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.310 182627 DEBUG nova.network.neutron [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.326 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.448 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.450 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.451 182627 INFO nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Creating image(s)#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.452 182627 DEBUG nova.objects.instance [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'trusted_certs' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.464 182627 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.553 182627 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.554 182627 DEBUG nova.virt.disk.api [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Checking if we can resize image /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.555 182627 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.621 182627 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.622 182627 DEBUG nova.virt.disk.api [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Cannot resize image /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.645 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.645 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Ensure instance console log exists: /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.646 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.647 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.647 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.650 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.655 182627 WARNING nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.660 182627 DEBUG nova.virt.libvirt.host [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.660 182627 DEBUG nova.virt.libvirt.host [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.663 182627 DEBUG nova.virt.libvirt.host [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.663 182627 DEBUG nova.virt.libvirt.host [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.664 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.665 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.665 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.665 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.666 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.666 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.666 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.666 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.666 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.667 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.667 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.667 182627 DEBUG nova.virt.hardware [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.667 182627 DEBUG nova.objects.instance [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.690 182627 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.714 182627 INFO nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Creating config drive at /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.config#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.725 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdw6siu_3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.750 182627 DEBUG oslo_concurrency.processutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.751 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.752 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.753 182627 DEBUG oslo_concurrency.lockutils [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.756 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <uuid>ce913c81-c8b7-4b71-91b0-ec941d59dc1c</uuid>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <name>instance-00000027</name>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <memory>196608</memory>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:name>tempest-MigrationsAdminTest-server-1151892985</nova:name>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:22:12</nova:creationTime>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.micro">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:memory>192</nova:memory>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:        <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="serial">ce913c81-c8b7-4b71-91b0-ec941d59dc1c</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="uuid">ce913c81-c8b7-4b71-91b0-ec941d59dc1c</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk.config"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/console.log" append="off"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:22:12 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:22:12 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:22:12 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:22:12 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.824 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.824 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.825 182627 INFO nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Using config drive#033[00m
Jan 22 17:22:12 np0005592767 nova_compute[182623]: 2026-01-22 22:22:12.864 182627 DEBUG oslo_concurrency.processutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdw6siu_3" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:12 np0005592767 systemd-machined[153912]: New machine qemu-20-instance-00000027.
Jan 22 17:22:12 np0005592767 systemd[1]: Started Virtual Machine qemu-20-instance-00000027.
Jan 22 17:22:12 np0005592767 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 17:22:12 np0005592767 kernel: tapd48f9605-21: entered promiscuous mode
Jan 22 17:22:12 np0005592767 NetworkManager[54973]: <info>  [1769120532.9678] manager: (tapd48f9605-21): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.005 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:13Z|00143|binding|INFO|Claiming lport d48f9605-219d-4393-b6ef-7d9c783e0a18 for this chassis.
Jan 22 17:22:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:13Z|00144|binding|INFO|d48f9605-219d-4393-b6ef-7d9c783e0a18: Claiming fa:16:3e:1b:ac:f3 10.100.0.5
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.013 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.030 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:ac:f3 10.100.0.5'], port_security=['fa:16:3e:1b:ac:f3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3f7c3d83-9a90-4ca3-8206-22e36eef2e04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=d48f9605-219d-4393-b6ef-7d9c783e0a18) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.031 104135 INFO neutron.agent.ovn.metadata.agent [-] Port d48f9605-219d-4393-b6ef-7d9c783e0a18 in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 bound to our chassis#033[00m
Jan 22 17:22:13 np0005592767 systemd-udevd[216486]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.034 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd5f6392-bfb2-42bf-a825-c0516c8891b0#033[00m
Jan 22 17:22:13 np0005592767 NetworkManager[54973]: <info>  [1769120533.0540] device (tapd48f9605-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:22:13 np0005592767 NetworkManager[54973]: <info>  [1769120533.0549] device (tapd48f9605-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.054 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[77d5d5ce-23ee-4e64-9342-3179f2233c9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.056 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd5f6392-b1 in ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.058 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd5f6392-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.058 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a09734de-0a26-4d02-a26b-5c4b2fd694e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8352a8f6-0359-463b-b753-d1b52c5ef7c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 systemd-machined[153912]: New machine qemu-21-instance-00000028.
Jan 22 17:22:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:13Z|00145|binding|INFO|Setting lport d48f9605-219d-4393-b6ef-7d9c783e0a18 ovn-installed in OVS
Jan 22 17:22:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:13Z|00146|binding|INFO|Setting lport d48f9605-219d-4393-b6ef-7d9c783e0a18 up in Southbound
Jan 22 17:22:13 np0005592767 systemd[1]: Started Virtual Machine qemu-21-instance-00000028.
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.075 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.076 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[8c53d3bd-a8ab-41d2-ac15-0f430cb71016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.106 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[647744b8-ea00-43da-8c6b-32e94371ba02]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.151 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9c3167-6013-4123-b12c-26825b0f793c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 NetworkManager[54973]: <info>  [1769120533.1617] manager: (tapdd5f6392-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.161 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e3cce2-5092-4845-82db-05a9d7de430e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.217 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[81692676-656a-4e6a-bb98-ae032ef861da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.221 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[df1e9ed9-ab18-4c73-8c96-fc37eb979ab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 NetworkManager[54973]: <info>  [1769120533.2509] device (tapdd5f6392-b0): carrier: link connected
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.259 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120533.2586906, ce913c81-c8b7-4b71-91b0-ec941d59dc1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.259 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.262 182627 DEBUG nova.compute.manager [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.264 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[920346bd-4738-4741-8265-e79a08afd19f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.265 182627 INFO nova.virt.libvirt.driver [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance running successfully.#033[00m
Jan 22 17:22:13 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.267 182627 DEBUG nova.virt.libvirt.guest [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.268 182627 DEBUG nova.virt.libvirt.driver [None req-539765d2-7830-4994-b586-bb5b0da775cc 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 22 17:22:13 np0005592767 podman[216524]: 2026-01-22 22:22:13.29444542 +0000 UTC m=+0.073520487 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.296 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[757c52aa-3d5a-4fc2-927e-bd1bf590fb04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412982, 'reachable_time': 15115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216545, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.298 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.307 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.325 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c6ec14-146a-4581-a6e8-e79eaa4caa62]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:d723'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412982, 'tstamp': 412982}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216553, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.344 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.344 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120533.2597904, ce913c81-c8b7-4b71-91b0-ec941d59dc1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.344 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] VM Started (Lifecycle Event)#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.349 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[52fd0cb4-f2eb-450d-994b-ad61255e0910]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412982, 'reachable_time': 15115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216554, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.367 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.371 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.398 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[53dddeb0-b671-41b3-9448-6e38a8f0e967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.475 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56b412a6-30a0-4664-99db-6cbf6b3a7976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.476 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.477 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.478 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd5f6392-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:13 np0005592767 NetworkManager[54973]: <info>  [1769120533.4805] manager: (tapdd5f6392-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 22 17:22:13 np0005592767 kernel: tapdd5f6392-b0: entered promiscuous mode
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.479 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.483 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.486 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd5f6392-b0, col_values=(('external_ids', {'iface-id': 'c2b5e191-6c34-4707-83d4-b3c5bc12ff1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.488 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:13Z|00147|binding|INFO|Releasing lport c2b5e191-6c34-4707-83d4-b3c5bc12ff1e from this chassis (sb_readonly=0)
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.511 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.513 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.514 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b5e9cf-8a5c-4019-849b-ea4301ccaef3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.515 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:22:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:13.517 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'env', 'PROCESS_TAG=haproxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd5f6392-bfb2-42bf-a825-c0516c8891b0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.642 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120533.6416926, 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.642 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] VM Started (Lifecycle Event)#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.671 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.674 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120533.6419413, 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.674 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.699 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.701 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:13 np0005592767 nova_compute[182623]: 2026-01-22 22:22:13.728 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:22:13 np0005592767 podman[216593]: 2026-01-22 22:22:13.923845153 +0000 UTC m=+0.054703633 container create 3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:22:13 np0005592767 systemd[1]: Started libpod-conmon-3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778.scope.
Jan 22 17:22:13 np0005592767 podman[216593]: 2026-01-22 22:22:13.893112611 +0000 UTC m=+0.023971111 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:22:13 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:22:13 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be81ad5dd346a6d09c021e7dc441b786ca06d59326d487a7956cc021b80a5fbd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:22:14 np0005592767 podman[216593]: 2026-01-22 22:22:14.005524001 +0000 UTC m=+0.136382571 container init 3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:22:14 np0005592767 podman[216593]: 2026-01-22 22:22:14.01077476 +0000 UTC m=+0.141633260 container start 3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:22:14 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [NOTICE]   (216612) : New worker (216614) forked
Jan 22 17:22:14 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [NOTICE]   (216612) : Loading success.
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.086 182627 DEBUG nova.compute.manager [req-9cc2464b-68dd-45d0-a343-b5bff29da7f8 req-17f0b9d8-354a-4bf1-89d3-2ff5a4dc4a40 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.086 182627 DEBUG oslo_concurrency.lockutils [req-9cc2464b-68dd-45d0-a343-b5bff29da7f8 req-17f0b9d8-354a-4bf1-89d3-2ff5a4dc4a40 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.086 182627 DEBUG oslo_concurrency.lockutils [req-9cc2464b-68dd-45d0-a343-b5bff29da7f8 req-17f0b9d8-354a-4bf1-89d3-2ff5a4dc4a40 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.087 182627 DEBUG oslo_concurrency.lockutils [req-9cc2464b-68dd-45d0-a343-b5bff29da7f8 req-17f0b9d8-354a-4bf1-89d3-2ff5a4dc4a40 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.087 182627 DEBUG nova.compute.manager [req-9cc2464b-68dd-45d0-a343-b5bff29da7f8 req-17f0b9d8-354a-4bf1-89d3-2ff5a4dc4a40 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Processing event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.087 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.091 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120534.0915394, 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.092 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.094 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.097 182627 INFO nova.virt.libvirt.driver [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Instance spawned successfully.#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.098 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.139 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.146 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.149 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.149 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.150 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.150 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.150 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.151 182627 DEBUG nova.virt.libvirt.driver [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.154 182627 DEBUG nova.network.neutron [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Updated VIF entry in instance network info cache for port d48f9605-219d-4393-b6ef-7d9c783e0a18. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.155 182627 DEBUG nova.network.neutron [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Updating instance_info_cache with network_info: [{"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.171 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.217 182627 DEBUG oslo_concurrency.lockutils [req-9c821871-aadb-458c-ba69-d937292cbfb4 req-cc2bb6b9-fafd-43c9-b359-aa821122b66b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-3f7c3d83-9a90-4ca3-8206-22e36eef2e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.240 182627 INFO nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Took 6.43 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.241 182627 DEBUG nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.321 182627 INFO nova.compute.manager [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Took 6.97 seconds to build instance.#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.337 182627 DEBUG oslo_concurrency.lockutils [None req-2cb09e60-5d5c-48f3-a648-1f232e9a9b0f 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.941 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.942 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.942 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:14 np0005592767 nova_compute[182623]: 2026-01-22 22:22:14.942 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.053 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.109 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.110 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.164 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.170 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.224 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.225 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.278 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.473 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.475 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5502MB free_disk=73.24510192871094GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.475 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.475 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.544 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Applying migration context for instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c as it has an incoming, in-progress migration 90e6c5da-445b-4739-932f-03c7172db2c7. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.545 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating resource usage from migration 90e6c5da-445b-4739-932f-03c7172db2c7#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.560 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.561 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.561 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.561 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.616 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.634 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.660 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:22:15 np0005592767 nova_compute[182623]: 2026-01-22 22:22:15.660 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.306 182627 DEBUG nova.compute.manager [req-77489e13-bbfe-480a-a2c2-75b7fed01051 req-89ef284b-6369-4db3-ae45-3f873912cf3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.306 182627 DEBUG oslo_concurrency.lockutils [req-77489e13-bbfe-480a-a2c2-75b7fed01051 req-89ef284b-6369-4db3-ae45-3f873912cf3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.307 182627 DEBUG oslo_concurrency.lockutils [req-77489e13-bbfe-480a-a2c2-75b7fed01051 req-89ef284b-6369-4db3-ae45-3f873912cf3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.307 182627 DEBUG oslo_concurrency.lockutils [req-77489e13-bbfe-480a-a2c2-75b7fed01051 req-89ef284b-6369-4db3-ae45-3f873912cf3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.307 182627 DEBUG nova.compute.manager [req-77489e13-bbfe-480a-a2c2-75b7fed01051 req-89ef284b-6369-4db3-ae45-3f873912cf3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] No waiting events found dispatching network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.307 182627 WARNING nova.compute.manager [req-77489e13-bbfe-480a-a2c2-75b7fed01051 req-89ef284b-6369-4db3-ae45-3f873912cf3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received unexpected event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.740 182627 INFO nova.compute.manager [None req-e883b8ee-5c41-4d8f-b9de-df831865e014 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Pausing#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.742 182627 DEBUG nova.objects.instance [None req-e883b8ee-5c41-4d8f-b9de-df831865e014 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'flavor' on Instance uuid 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.776 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120536.7758791, 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.776 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.778 182627 DEBUG nova.compute.manager [None req-e883b8ee-5c41-4d8f-b9de-df831865e014 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.801 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.804 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:16 np0005592767 nova_compute[182623]: 2026-01-22 22:22:16.831 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 22 17:22:17 np0005592767 nova_compute[182623]: 2026-01-22 22:22:17.152 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:19 np0005592767 nova_compute[182623]: 2026-01-22 22:22:19.712 182627 DEBUG nova.compute.manager [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:19 np0005592767 nova_compute[182623]: 2026-01-22 22:22:19.776 182627 INFO nova.compute.manager [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] instance snapshotting#033[00m
Jan 22 17:22:19 np0005592767 nova_compute[182623]: 2026-01-22 22:22:19.777 182627 WARNING nova.compute.manager [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.220 182627 INFO nova.virt.libvirt.driver [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Beginning live snapshot process#033[00m
Jan 22 17:22:20 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.633 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.697 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.698 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.757 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.777 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.838 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.840 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpwcxy1vse/1d1a83c119824058a3ebe45d84c84af4.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.880 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpwcxy1vse/1d1a83c119824058a3ebe45d84c84af4.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.882 182627 INFO nova.virt.libvirt.driver [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.948 182627 DEBUG nova.virt.libvirt.guest [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:22:20 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.951 182627 INFO nova.virt.libvirt.driver [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:22:21 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.997 182627 DEBUG nova.privsep.utils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:22:21 np0005592767 nova_compute[182623]: 2026-01-22 22:22:20.998 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpwcxy1vse/1d1a83c119824058a3ebe45d84c84af4.delta /var/lib/nova/instances/snapshots/tmpwcxy1vse/1d1a83c119824058a3ebe45d84c84af4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:21 np0005592767 nova_compute[182623]: 2026-01-22 22:22:21.150 182627 DEBUG oslo_concurrency.processutils [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpwcxy1vse/1d1a83c119824058a3ebe45d84c84af4.delta /var/lib/nova/instances/snapshots/tmpwcxy1vse/1d1a83c119824058a3ebe45d84c84af4" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:21 np0005592767 nova_compute[182623]: 2026-01-22 22:22:21.153 182627 INFO nova.virt.libvirt.driver [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:22:21 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:22:21 np0005592767 systemd[216417]: Activating special unit Exit the Session...
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped target Main User Target.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped target Basic System.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped target Paths.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped target Sockets.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped target Timers.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:21 np0005592767 systemd[216417]: Closed D-Bus User Message Bus Socket.
Jan 22 17:22:21 np0005592767 systemd[216417]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:22:21 np0005592767 systemd[216417]: Removed slice User Application Slice.
Jan 22 17:22:21 np0005592767 systemd[216417]: Reached target Shutdown.
Jan 22 17:22:21 np0005592767 systemd[216417]: Finished Exit the Session.
Jan 22 17:22:21 np0005592767 systemd[216417]: Reached target Exit the Session.
Jan 22 17:22:21 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:22:21 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:22:21 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:22:21 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:22:21 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:22:21 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:22:21 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:22:22 np0005592767 nova_compute[182623]: 2026-01-22 22:22:22.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:23 np0005592767 nova_compute[182623]: 2026-01-22 22:22:23.259 182627 INFO nova.virt.libvirt.driver [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Snapshot image upload complete#033[00m
Jan 22 17:22:23 np0005592767 nova_compute[182623]: 2026-01-22 22:22:23.260 182627 INFO nova.compute.manager [None req-1b05edf2-fe41-4aaa-8ff0-e8a857c6c9ee 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Took 3.47 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.965 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.966 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.966 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.966 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.966 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.980 182627 INFO nova.compute.manager [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Terminating instance#033[00m
Jan 22 17:22:25 np0005592767 nova_compute[182623]: 2026-01-22 22:22:25.992 182627 DEBUG nova.compute.manager [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:22:26 np0005592767 kernel: tapd48f9605-21 (unregistering): left promiscuous mode
Jan 22 17:22:26 np0005592767 NetworkManager[54973]: <info>  [1769120546.0116] device (tapd48f9605-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:22:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:26Z|00148|binding|INFO|Releasing lport d48f9605-219d-4393-b6ef-7d9c783e0a18 from this chassis (sb_readonly=0)
Jan 22 17:22:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:26Z|00149|binding|INFO|Setting lport d48f9605-219d-4393-b6ef-7d9c783e0a18 down in Southbound
Jan 22 17:22:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:26Z|00150|binding|INFO|Removing iface tapd48f9605-21 ovn-installed in OVS
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.021 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.037 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:ac:f3 10.100.0.5'], port_security=['fa:16:3e:1b:ac:f3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3f7c3d83-9a90-4ca3-8206-22e36eef2e04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=d48f9605-219d-4393-b6ef-7d9c783e0a18) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.038 104135 INFO neutron.agent.ovn.metadata.agent [-] Port d48f9605-219d-4393-b6ef-7d9c783e0a18 in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.040 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.041 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2685a058-15fb-417e-9b8d-1c03452d12d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.041 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace which is not needed anymore#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.050 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 22 17:22:26 np0005592767 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000028.scope: Consumed 3.294s CPU time.
Jan 22 17:22:26 np0005592767 systemd-machined[153912]: Machine qemu-21-instance-00000028 terminated.
Jan 22 17:22:26 np0005592767 podman[216680]: 2026-01-22 22:22:26.126070867 +0000 UTC m=+0.075866224 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:22:26 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [NOTICE]   (216612) : haproxy version is 2.8.14-c23fe91
Jan 22 17:22:26 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [NOTICE]   (216612) : path to executable is /usr/sbin/haproxy
Jan 22 17:22:26 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [WARNING]  (216612) : Exiting Master process...
Jan 22 17:22:26 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [WARNING]  (216612) : Exiting Master process...
Jan 22 17:22:26 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [ALERT]    (216612) : Current worker (216614) exited with code 143 (Terminated)
Jan 22 17:22:26 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216608]: [WARNING]  (216612) : All workers exited. Exiting... (0)
Jan 22 17:22:26 np0005592767 systemd[1]: libpod-3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778.scope: Deactivated successfully.
Jan 22 17:22:26 np0005592767 podman[216725]: 2026-01-22 22:22:26.193681186 +0000 UTC m=+0.051789831 container died 3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:22:26 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778-userdata-shm.mount: Deactivated successfully.
Jan 22 17:22:26 np0005592767 systemd[1]: var-lib-containers-storage-overlay-be81ad5dd346a6d09c021e7dc441b786ca06d59326d487a7956cc021b80a5fbd-merged.mount: Deactivated successfully.
Jan 22 17:22:26 np0005592767 podman[216725]: 2026-01-22 22:22:26.241053941 +0000 UTC m=+0.099162596 container cleanup 3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.257 182627 INFO nova.virt.libvirt.driver [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Instance destroyed successfully.#033[00m
Jan 22 17:22:26 np0005592767 systemd[1]: libpod-conmon-3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778.scope: Deactivated successfully.
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.259 182627 DEBUG nova.objects.instance [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'resources' on Instance uuid 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.271 182627 DEBUG nova.virt.libvirt.vif [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:22:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2092297008',display_name='tempest-ImagesTestJSON-server-2092297008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2092297008',id=40,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:22:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-ksigyrm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:22:23Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=3f7c3d83-9a90-4ca3-8206-22e36eef2e04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.272 182627 DEBUG nova.network.os_vif_util [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "address": "fa:16:3e:1b:ac:f3", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd48f9605-21", "ovs_interfaceid": "d48f9605-219d-4393-b6ef-7d9c783e0a18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.273 182627 DEBUG nova.network.os_vif_util [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.273 182627 DEBUG os_vif [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.275 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.276 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd48f9605-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.277 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.281 182627 INFO os_vif [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:ac:f3,bridge_name='br-int',has_traffic_filtering=True,id=d48f9605-219d-4393-b6ef-7d9c783e0a18,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd48f9605-21')#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.282 182627 INFO nova.virt.libvirt.driver [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Deleting instance files /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04_del#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.283 182627 INFO nova.virt.libvirt.driver [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Deletion of /var/lib/nova/instances/3f7c3d83-9a90-4ca3-8206-22e36eef2e04_del complete#033[00m
Jan 22 17:22:26 np0005592767 podman[216771]: 2026-01-22 22:22:26.305749287 +0000 UTC m=+0.040752048 container remove 3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.311 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f04083be-7fd0-4507-b685-07f90f2a222d]: (4, ('Thu Jan 22 10:22:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778)\n3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778\nThu Jan 22 10:22:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778)\n3b38b8ba60b0c85829222af2ffa734a3138b623ad3429955732e668d12fae778\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.313 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab4b902-00a7-48fe-8743-ee01399290e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.314 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.316 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 kernel: tapdd5f6392-b0: left promiscuous mode
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.334 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.337 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e14163c8-ad61-4a72-abb4-d5441037c9c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.353 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef14e06-0f2f-4933-818f-919eb01195f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.354 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[998be002-bc03-4114-becf-06510a22c2aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.357 182627 INFO nova.compute.manager [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.358 182627 DEBUG oslo.service.loopingcall [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.358 182627 DEBUG nova.compute.manager [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:22:26 np0005592767 nova_compute[182623]: 2026-01-22 22:22:26.359 182627 DEBUG nova.network.neutron [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.369 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ddfd2435-5774-4fe0-8807-363adc1930e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412972, 'reachable_time': 32822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216785, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:26 np0005592767 systemd[1]: run-netns-ovnmeta\x2ddd5f6392\x2dbfb2\x2d42bf\x2da825\x2dc0516c8891b0.mount: Deactivated successfully.
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.372 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:22:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:26.372 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a694e709-3722-4e44-86ac-293ab8b88d9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.326 182627 DEBUG nova.network.neutron [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.349 182627 INFO nova.compute.manager [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Took 0.99 seconds to deallocate network for instance.#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.392 182627 DEBUG nova.compute.manager [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-vif-unplugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.392 182627 DEBUG oslo_concurrency.lockutils [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.393 182627 DEBUG oslo_concurrency.lockutils [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.393 182627 DEBUG oslo_concurrency.lockutils [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.393 182627 DEBUG nova.compute.manager [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] No waiting events found dispatching network-vif-unplugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.394 182627 DEBUG nova.compute.manager [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-vif-unplugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.394 182627 DEBUG nova.compute.manager [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.394 182627 DEBUG oslo_concurrency.lockutils [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.395 182627 DEBUG oslo_concurrency.lockutils [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.395 182627 DEBUG oslo_concurrency.lockutils [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.395 182627 DEBUG nova.compute.manager [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] No waiting events found dispatching network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.396 182627 WARNING nova.compute.manager [req-e500d494-48bc-4ef6-a811-23a9916ffa41 req-8c5b3608-53fc-4703-8a0e-774d2ac02d95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received unexpected event network-vif-plugged-d48f9605-219d-4393-b6ef-7d9c783e0a18 for instance with vm_state paused and task_state deleting.#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.454 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.455 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.497 182627 DEBUG nova.compute.manager [req-3ac083f5-63fb-4474-97b7-7ec507369791 req-5528be02-886e-48ee-9606-c041d0c024f8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Received event network-vif-deleted-d48f9605-219d-4393-b6ef-7d9c783e0a18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.554 182627 DEBUG nova.compute.provider_tree [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.569 182627 DEBUG nova.scheduler.client.report [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.595 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.619 182627 INFO nova.scheduler.client.report [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Deleted allocations for instance 3f7c3d83-9a90-4ca3-8206-22e36eef2e04#033[00m
Jan 22 17:22:27 np0005592767 nova_compute[182623]: 2026-01-22 22:22:27.695 182627 DEBUG oslo_concurrency.lockutils [None req-c48ace23-1cdd-4c38-8d99-9603775b0b89 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "3f7c3d83-9a90-4ca3-8206-22e36eef2e04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.714 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.715 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.743 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.865 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.865 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.871 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:22:28 np0005592767 nova_compute[182623]: 2026-01-22 22:22:28.871 182627 INFO nova.compute.claims [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.075 182627 DEBUG nova.compute.provider_tree [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.092 182627 DEBUG nova.scheduler.client.report [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.114 182627 DEBUG nova.compute.manager [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.121 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.121 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.197 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.198 182627 DEBUG nova.network.neutron [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.216 182627 INFO nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.229 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.229 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.232 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.257 182627 DEBUG nova.objects.instance [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_requests' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.276 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.277 182627 INFO nova.compute.claims [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.277 182627 DEBUG nova.objects.instance [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.292 182627 DEBUG nova.objects.instance [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'pci_devices' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.349 182627 INFO nova.compute.resource_tracker [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Updating resource usage from migration 4d278d51-e944-41b0-aee1-7ca339e33dd0#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.349 182627 DEBUG nova.compute.resource_tracker [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Starting to track incoming migration 4d278d51-e944-41b0-aee1-7ca339e33dd0 with flavor 617fb2f8-2c15-4939-a64a-90fca4acd12a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.368 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.369 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.369 182627 INFO nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Creating image(s)#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.370 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "/var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.370 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.371 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.385 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.453 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.454 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.454 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.464 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.479 182627 DEBUG nova.policy [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d9fe7f0e8b4edf92fa2064aaab8bca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3a2ee662fba426c8f688455b20759bf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.514 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.515 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.535 182627 DEBUG nova.compute.provider_tree [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.550 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.550 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.551 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.567 182627 DEBUG nova.scheduler.client.report [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.589 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.589 182627 INFO nova.compute.manager [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Migrating#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.603 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.603 182627 DEBUG nova.virt.disk.api [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Checking if we can resize image /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.604 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.656 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.657 182627 DEBUG nova.virt.disk.api [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Cannot resize image /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.658 182627 DEBUG nova.objects.instance [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'migration_context' on Instance uuid 87d4ec2c-3bc5-4c68-826d-9403021dd81a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.692 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.693 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Ensure instance console log exists: /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.694 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.694 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:29 np0005592767 nova_compute[182623]: 2026-01-22 22:22:29.695 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:30 np0005592767 nova_compute[182623]: 2026-01-22 22:22:30.059 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:30 np0005592767 podman[216802]: 2026-01-22 22:22:30.148889504 +0000 UTC m=+0.067217249 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350)
Jan 22 17:22:30 np0005592767 podman[216801]: 2026-01-22 22:22:30.165450294 +0000 UTC m=+0.083551963 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:22:30 np0005592767 nova_compute[182623]: 2026-01-22 22:22:30.395 182627 DEBUG nova.network.neutron [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Successfully created port: b5915651-92b6-4103-a6ab-70b90dfac2b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:22:30 np0005592767 systemd-logind[802]: New session 38 of user nova.
Jan 22 17:22:30 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:22:30 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:22:30 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:22:30 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:22:30 np0005592767 systemd[216851]: Queued start job for default target Main User Target.
Jan 22 17:22:30 np0005592767 systemd[216851]: Created slice User Application Slice.
Jan 22 17:22:30 np0005592767 systemd[216851]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:30 np0005592767 systemd[216851]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:30 np0005592767 systemd[216851]: Reached target Paths.
Jan 22 17:22:30 np0005592767 systemd[216851]: Reached target Timers.
Jan 22 17:22:30 np0005592767 systemd[216851]: Starting D-Bus User Message Bus Socket...
Jan 22 17:22:30 np0005592767 systemd[216851]: Starting Create User's Volatile Files and Directories...
Jan 22 17:22:30 np0005592767 systemd[216851]: Finished Create User's Volatile Files and Directories.
Jan 22 17:22:30 np0005592767 systemd[216851]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:22:30 np0005592767 systemd[216851]: Reached target Sockets.
Jan 22 17:22:30 np0005592767 systemd[216851]: Reached target Basic System.
Jan 22 17:22:30 np0005592767 systemd[216851]: Reached target Main User Target.
Jan 22 17:22:30 np0005592767 systemd[216851]: Startup finished in 172ms.
Jan 22 17:22:30 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:22:30 np0005592767 systemd[1]: Started Session 38 of User nova.
Jan 22 17:22:30 np0005592767 systemd[1]: session-38.scope: Deactivated successfully.
Jan 22 17:22:30 np0005592767 systemd-logind[802]: Session 38 logged out. Waiting for processes to exit.
Jan 22 17:22:30 np0005592767 systemd-logind[802]: Removed session 38.
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.091 182627 DEBUG nova.network.neutron [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Successfully updated port: b5915651-92b6-4103-a6ab-70b90dfac2b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.106 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "refresh_cache-87d4ec2c-3bc5-4c68-826d-9403021dd81a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.107 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquired lock "refresh_cache-87d4ec2c-3bc5-4c68-826d-9403021dd81a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.107 182627 DEBUG nova.network.neutron [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:22:31 np0005592767 systemd-logind[802]: New session 40 of user nova.
Jan 22 17:22:31 np0005592767 systemd[1]: Started Session 40 of User nova.
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.240 182627 DEBUG nova.network.neutron [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:22:31 np0005592767 systemd[1]: session-40.scope: Deactivated successfully.
Jan 22 17:22:31 np0005592767 systemd-logind[802]: Session 40 logged out. Waiting for processes to exit.
Jan 22 17:22:31 np0005592767 systemd-logind[802]: Removed session 40.
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.619 182627 DEBUG nova.compute.manager [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received event network-changed-b5915651-92b6-4103-a6ab-70b90dfac2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.620 182627 DEBUG nova.compute.manager [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Refreshing instance network info cache due to event network-changed-b5915651-92b6-4103-a6ab-70b90dfac2b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:22:31 np0005592767 nova_compute[182623]: 2026-01-22 22:22:31.620 182627 DEBUG oslo_concurrency.lockutils [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-87d4ec2c-3bc5-4c68-826d-9403021dd81a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:22:32 np0005592767 nova_compute[182623]: 2026-01-22 22:22:32.992 182627 DEBUG nova.network.neutron [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Updating instance_info_cache with network_info: [{"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.018 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Releasing lock "refresh_cache-87d4ec2c-3bc5-4c68-826d-9403021dd81a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.018 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance network_info: |[{"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.019 182627 DEBUG oslo_concurrency.lockutils [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-87d4ec2c-3bc5-4c68-826d-9403021dd81a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.019 182627 DEBUG nova.network.neutron [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Refreshing network info cache for port b5915651-92b6-4103-a6ab-70b90dfac2b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.024 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Start _get_guest_xml network_info=[{"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.032 182627 WARNING nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.044 182627 DEBUG nova.virt.libvirt.host [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.046 182627 DEBUG nova.virt.libvirt.host [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.052 182627 DEBUG nova.virt.libvirt.host [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.053 182627 DEBUG nova.virt.libvirt.host [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.055 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.056 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.056 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.057 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.057 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.057 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.058 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.058 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.059 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.059 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.059 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.060 182627 DEBUG nova.virt.hardware [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.065 182627 DEBUG nova.virt.libvirt.vif [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:22:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1916328514',display_name='tempest-ImagesTestJSON-server-1916328514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1916328514',id=43,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-7bstce2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:22:29Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=87d4ec2c-3bc5-4c68-826d-9403021dd81a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.066 182627 DEBUG nova.network.os_vif_util [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.067 182627 DEBUG nova.network.os_vif_util [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.069 182627 DEBUG nova.objects.instance [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 87d4ec2c-3bc5-4c68-826d-9403021dd81a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.086 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <uuid>87d4ec2c-3bc5-4c68-826d-9403021dd81a</uuid>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <name>instance-0000002b</name>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:name>tempest-ImagesTestJSON-server-1916328514</nova:name>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:22:33</nova:creationTime>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:user uuid="52d9fe7f0e8b4edf92fa2064aaab8bca">tempest-ImagesTestJSON-23148374-project-member</nova:user>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:project uuid="d3a2ee662fba426c8f688455b20759bf">tempest-ImagesTestJSON-23148374</nova:project>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        <nova:port uuid="b5915651-92b6-4103-a6ab-70b90dfac2b8">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <entry name="serial">87d4ec2c-3bc5-4c68-826d-9403021dd81a</entry>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <entry name="uuid">87d4ec2c-3bc5-4c68-826d-9403021dd81a</entry>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.config"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:a9:33:13"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <target dev="tapb5915651-92"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/console.log" append="off"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:22:33 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:22:33 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:22:33 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:22:33 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.087 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Preparing to wait for external event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.088 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.088 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.088 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.089 182627 DEBUG nova.virt.libvirt.vif [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:22:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1916328514',display_name='tempest-ImagesTestJSON-server-1916328514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1916328514',id=43,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-7bstce2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:22:29Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=87d4ec2c-3bc5-4c68-826d-9403021dd81a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.090 182627 DEBUG nova.network.os_vif_util [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.091 182627 DEBUG nova.network.os_vif_util [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.092 182627 DEBUG os_vif [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.093 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.094 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.095 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.099 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.100 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5915651-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.100 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5915651-92, col_values=(('external_ids', {'iface-id': 'b5915651-92b6-4103-a6ab-70b90dfac2b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:33:13', 'vm-uuid': '87d4ec2c-3bc5-4c68-826d-9403021dd81a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:33 np0005592767 NetworkManager[54973]: <info>  [1769120553.1043] manager: (tapb5915651-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.119 182627 INFO os_vif [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92')#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.220 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.220 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.221 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No VIF found with MAC fa:16:3e:a9:33:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.222 182627 INFO nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Using config drive#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.732 182627 INFO nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Creating config drive at /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.config#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.737 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8i8rvw95 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.864 182627 DEBUG oslo_concurrency.processutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8i8rvw95" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:33 np0005592767 kernel: tapb5915651-92: entered promiscuous mode
Jan 22 17:22:33 np0005592767 NetworkManager[54973]: <info>  [1769120553.9505] manager: (tapb5915651-92): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Jan 22 17:22:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:33Z|00151|binding|INFO|Claiming lport b5915651-92b6-4103-a6ab-70b90dfac2b8 for this chassis.
Jan 22 17:22:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:33Z|00152|binding|INFO|b5915651-92b6-4103-a6ab-70b90dfac2b8: Claiming fa:16:3e:a9:33:13 10.100.0.12
Jan 22 17:22:33 np0005592767 nova_compute[182623]: 2026-01-22 22:22:33.989 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:33.996 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:33:13 10.100.0.12'], port_security=['fa:16:3e:a9:33:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87d4ec2c-3bc5-4c68-826d-9403021dd81a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b5915651-92b6-4103-a6ab-70b90dfac2b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:22:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:33.997 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b5915651-92b6-4103-a6ab-70b90dfac2b8 in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 bound to our chassis#033[00m
Jan 22 17:22:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:33.998 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd5f6392-bfb2-42bf-a825-c0516c8891b0#033[00m
Jan 22 17:22:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:34Z|00153|binding|INFO|Setting lport b5915651-92b6-4103-a6ab-70b90dfac2b8 ovn-installed in OVS
Jan 22 17:22:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:34Z|00154|binding|INFO|Setting lport b5915651-92b6-4103-a6ab-70b90dfac2b8 up in Southbound
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.009 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.012 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dc45d110-7625-4f96-8831-0e7f7fa7a4f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.013 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd5f6392-b1 in ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.016 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd5f6392-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.016 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0d55071e-caa3-4f2c-a0c0-07ad13fc971a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.017 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[55c9fc77-581e-4ef8-a453-4dd6e1cda68b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 systemd-machined[153912]: New machine qemu-22-instance-0000002b.
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.032 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[31a3ffb4-4491-49a2-a8f2-128ce62eb2c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 systemd[1]: Started Virtual Machine qemu-22-instance-0000002b.
Jan 22 17:22:34 np0005592767 systemd-udevd[216896]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.047 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4adc2e56-85b0-4683-937c-4c20a61a931e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 NetworkManager[54973]: <info>  [1769120554.0635] device (tapb5915651-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:22:34 np0005592767 NetworkManager[54973]: <info>  [1769120554.0640] device (tapb5915651-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.085 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f352bc60-28f3-4b94-b5d6-643da4817279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 systemd-udevd[216901]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:22:34 np0005592767 NetworkManager[54973]: <info>  [1769120554.0931] manager: (tapdd5f6392-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.092 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a10cec36-65e4-4ddf-81af-51c073251128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.127 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[00e5d361-b064-4a6e-ace1-6d32eb48d647]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.131 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[57c5105d-e2eb-4e4d-964b-92e845cf0397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 NetworkManager[54973]: <info>  [1769120554.1559] device (tapdd5f6392-b0): carrier: link connected
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.163 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6d309d-80d1-46a5-9a7a-9dbbdc16ace7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.180 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e11e5f65-d15e-419d-8e5b-9e8ee500eb92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415073, 'reachable_time': 23322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216926, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.195 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc0010c-85f0-437a-bc26-0f61df6cb388]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:d723'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415073, 'tstamp': 415073}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216927, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.210 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a124056-61f7-44b9-977c-762493d91eac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415073, 'reachable_time': 23322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216928, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.238 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f7438f4a-7a8c-4ab6-bfb4-7965a75330e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.293 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3b0e5d9a-0fb2-4b64-a9c3-4360b35ed87a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.294 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.294 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.295 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd5f6392-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.296 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:34 np0005592767 NetworkManager[54973]: <info>  [1769120554.2972] manager: (tapdd5f6392-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 22 17:22:34 np0005592767 kernel: tapdd5f6392-b0: entered promiscuous mode
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.299 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.300 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd5f6392-b0, col_values=(('external_ids', {'iface-id': 'c2b5e191-6c34-4707-83d4-b3c5bc12ff1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.300 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:34Z|00155|binding|INFO|Releasing lport c2b5e191-6c34-4707-83d4-b3c5bc12ff1e from this chassis (sb_readonly=0)
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.315 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.315 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.316 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd30a33-b62c-4753-9416-4ae1dc96f686]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.317 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:22:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:34.317 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'env', 'PROCESS_TAG=haproxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd5f6392-bfb2-42bf-a825-c0516c8891b0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.566 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120554.5654106, 87d4ec2c-3bc5-4c68-826d-9403021dd81a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.566 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] VM Started (Lifecycle Event)#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.604 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.608 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120554.5704618, 87d4ec2c-3bc5-4c68-826d-9403021dd81a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.609 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.629 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.635 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:34 np0005592767 nova_compute[182623]: 2026-01-22 22:22:34.662 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:22:34 np0005592767 podman[216965]: 2026-01-22 22:22:34.731840907 +0000 UTC m=+0.060597251 container create a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:22:34 np0005592767 systemd[1]: Started libpod-conmon-a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f.scope.
Jan 22 17:22:34 np0005592767 podman[216965]: 2026-01-22 22:22:34.701836686 +0000 UTC m=+0.030593090 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:22:34 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:22:34 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bb5ddfe664bc5fe31ebc8709bb41d73bd8451e82c2c799aab40a20c2c276452/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:22:34 np0005592767 podman[216965]: 2026-01-22 22:22:34.821299366 +0000 UTC m=+0.150055750 container init a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:22:34 np0005592767 podman[216965]: 2026-01-22 22:22:34.828076749 +0000 UTC m=+0.156833103 container start a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:22:34 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [NOTICE]   (216985) : New worker (216987) forked
Jan 22 17:22:34 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [NOTICE]   (216985) : Loading success.
Jan 22 17:22:35 np0005592767 nova_compute[182623]: 2026-01-22 22:22:35.061 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:35 np0005592767 nova_compute[182623]: 2026-01-22 22:22:35.708 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:35.708 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:22:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:35.711 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.005 182627 DEBUG nova.network.neutron [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Updated VIF entry in instance network info cache for port b5915651-92b6-4103-a6ab-70b90dfac2b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.006 182627 DEBUG nova.network.neutron [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Updating instance_info_cache with network_info: [{"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.028 182627 DEBUG oslo_concurrency.lockutils [req-5712047d-c4ee-4509-8fe3-37a76b307542 req-2d566bca-95ff-46a0-9569-abed52c6cbe7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-87d4ec2c-3bc5-4c68-826d-9403021dd81a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.876 182627 DEBUG nova.compute.manager [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.876 182627 DEBUG oslo_concurrency.lockutils [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.877 182627 DEBUG oslo_concurrency.lockutils [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.877 182627 DEBUG oslo_concurrency.lockutils [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.877 182627 DEBUG nova.compute.manager [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Processing event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.877 182627 DEBUG nova.compute.manager [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.878 182627 DEBUG oslo_concurrency.lockutils [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.878 182627 DEBUG oslo_concurrency.lockutils [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.878 182627 DEBUG oslo_concurrency.lockutils [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.878 182627 DEBUG nova.compute.manager [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] No waiting events found dispatching network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.879 182627 WARNING nova.compute.manager [req-dcdeea0f-0b35-48ab-82b1-ac055c383396 req-37e61b1d-7215-4c75-9d9a-5e7272749b97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received unexpected event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.879 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.884 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120556.8841462, 87d4ec2c-3bc5-4c68-826d-9403021dd81a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.884 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.914 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.918 182627 INFO nova.virt.libvirt.driver [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance spawned successfully.#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.918 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.950 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.954 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.954 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.954 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.955 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.955 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.956 182627 DEBUG nova.virt.libvirt.driver [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.959 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:22:36 np0005592767 nova_compute[182623]: 2026-01-22 22:22:36.994 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:22:37 np0005592767 nova_compute[182623]: 2026-01-22 22:22:37.042 182627 INFO nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Took 7.67 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:22:37 np0005592767 nova_compute[182623]: 2026-01-22 22:22:37.043 182627 DEBUG nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:37 np0005592767 nova_compute[182623]: 2026-01-22 22:22:37.151 182627 INFO nova.compute.manager [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Took 8.32 seconds to build instance.#033[00m
Jan 22 17:22:37 np0005592767 podman[216996]: 2026-01-22 22:22:37.156940216 +0000 UTC m=+0.077850451 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:22:37 np0005592767 nova_compute[182623]: 2026-01-22 22:22:37.175 182627 DEBUG oslo_concurrency.lockutils [None req-78a9ce76-24ab-41cf-bc05-43067f91800d 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:37 np0005592767 podman[216997]: 2026-01-22 22:22:37.180861285 +0000 UTC m=+0.091926631 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:22:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:22:37.713 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.105 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.398 182627 DEBUG oslo_concurrency.lockutils [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.399 182627 DEBUG oslo_concurrency.lockutils [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.399 182627 DEBUG nova.compute.manager [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.403 182627 DEBUG nova.compute.manager [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.404 182627 DEBUG nova.objects.instance [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'flavor' on Instance uuid 87d4ec2c-3bc5-4c68-826d-9403021dd81a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.439 182627 DEBUG nova.objects.instance [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'info_cache' on Instance uuid 87d4ec2c-3bc5-4c68-826d-9403021dd81a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:38 np0005592767 nova_compute[182623]: 2026-01-22 22:22:38.475 182627 DEBUG nova.virt.libvirt.driver [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:22:40 np0005592767 nova_compute[182623]: 2026-01-22 22:22:40.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:41 np0005592767 nova_compute[182623]: 2026-01-22 22:22:41.256 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120546.2543344, 3f7c3d83-9a90-4ca3-8206-22e36eef2e04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:22:41 np0005592767 nova_compute[182623]: 2026-01-22 22:22:41.257 182627 INFO nova.compute.manager [-] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:22:41 np0005592767 nova_compute[182623]: 2026-01-22 22:22:41.292 182627 DEBUG nova.compute.manager [None req-b35ccee0-101f-4494-8def-804a2e597f15 - - - - - -] [instance: 3f7c3d83-9a90-4ca3-8206-22e36eef2e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:22:41 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:22:41 np0005592767 systemd[216851]: Activating special unit Exit the Session...
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped target Main User Target.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped target Basic System.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped target Paths.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped target Sockets.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped target Timers.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:41 np0005592767 systemd[216851]: Closed D-Bus User Message Bus Socket.
Jan 22 17:22:41 np0005592767 systemd[216851]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:22:41 np0005592767 systemd[216851]: Removed slice User Application Slice.
Jan 22 17:22:41 np0005592767 systemd[216851]: Reached target Shutdown.
Jan 22 17:22:41 np0005592767 systemd[216851]: Finished Exit the Session.
Jan 22 17:22:41 np0005592767 systemd[216851]: Reached target Exit the Session.
Jan 22 17:22:41 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:22:41 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:22:41 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:22:41 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:22:41 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:22:41 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:22:41 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:22:43 np0005592767 nova_compute[182623]: 2026-01-22 22:22:43.137 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:44 np0005592767 podman[217038]: 2026-01-22 22:22:44.152451613 +0000 UTC m=+0.071445019 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:22:44 np0005592767 systemd-logind[802]: New session 41 of user nova.
Jan 22 17:22:44 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:22:44 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:22:44 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:22:44 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:22:44 np0005592767 systemd[217066]: Queued start job for default target Main User Target.
Jan 22 17:22:44 np0005592767 systemd[217066]: Created slice User Application Slice.
Jan 22 17:22:44 np0005592767 systemd[217066]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:44 np0005592767 systemd[217066]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:44 np0005592767 systemd[217066]: Reached target Paths.
Jan 22 17:22:44 np0005592767 systemd[217066]: Reached target Timers.
Jan 22 17:22:44 np0005592767 systemd[217066]: Starting D-Bus User Message Bus Socket...
Jan 22 17:22:44 np0005592767 systemd[217066]: Starting Create User's Volatile Files and Directories...
Jan 22 17:22:44 np0005592767 systemd[217066]: Finished Create User's Volatile Files and Directories.
Jan 22 17:22:44 np0005592767 systemd[217066]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:22:44 np0005592767 systemd[217066]: Reached target Sockets.
Jan 22 17:22:44 np0005592767 systemd[217066]: Reached target Basic System.
Jan 22 17:22:44 np0005592767 systemd[217066]: Reached target Main User Target.
Jan 22 17:22:44 np0005592767 systemd[217066]: Startup finished in 132ms.
Jan 22 17:22:44 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:22:44 np0005592767 systemd[1]: Started Session 41 of User nova.
Jan 22 17:22:45 np0005592767 nova_compute[182623]: 2026-01-22 22:22:45.067 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:22:45 np0005592767 systemd[1]: session-41.scope: Deactivated successfully.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: Session 41 logged out. Waiting for processes to exit.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: Removed session 41.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: New session 43 of user nova.
Jan 22 17:22:45 np0005592767 systemd[1]: Started Session 43 of User nova.
Jan 22 17:22:45 np0005592767 systemd[1]: session-43.scope: Deactivated successfully.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: Session 43 logged out. Waiting for processes to exit.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: Removed session 43.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: New session 44 of user nova.
Jan 22 17:22:45 np0005592767 systemd[1]: Started Session 44 of User nova.
Jan 22 17:22:45 np0005592767 systemd[1]: session-44.scope: Deactivated successfully.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: Session 44 logged out. Waiting for processes to exit.
Jan 22 17:22:45 np0005592767 systemd-logind[802]: Removed session 44.
Jan 22 17:22:46 np0005592767 nova_compute[182623]: 2026-01-22 22:22:46.224 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-8577d985-21ea-4107-ba03-87076f31b935" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:22:46 np0005592767 nova_compute[182623]: 2026-01-22 22:22:46.226 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-8577d985-21ea-4107-ba03-87076f31b935" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:22:46 np0005592767 nova_compute[182623]: 2026-01-22 22:22:46.227 182627 DEBUG nova.network.neutron [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:22:46 np0005592767 nova_compute[182623]: 2026-01-22 22:22:46.358 182627 DEBUG nova.network.neutron [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:22:46 np0005592767 nova_compute[182623]: 2026-01-22 22:22:46.874 182627 DEBUG nova.network.neutron [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:22:46 np0005592767 nova_compute[182623]: 2026-01-22 22:22:46.898 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-8577d985-21ea-4107-ba03-87076f31b935" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.302 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.307 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.308 182627 INFO nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Creating image(s)#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.310 182627 DEBUG nova.objects.instance [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.329 182627 DEBUG oslo_concurrency.processutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.415 182627 DEBUG oslo_concurrency.processutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.417 182627 DEBUG nova.virt.disk.api [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Checking if we can resize image /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.418 182627 DEBUG oslo_concurrency.processutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.484 182627 DEBUG oslo_concurrency.processutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.486 182627 DEBUG nova.virt.disk.api [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Cannot resize image /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.519 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.520 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Ensure instance console log exists: /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.522 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.523 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.524 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.528 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.538 182627 WARNING nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.548 182627 DEBUG nova.virt.libvirt.host [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.549 182627 DEBUG nova.virt.libvirt.host [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.554 182627 DEBUG nova.virt.libvirt.host [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.555 182627 DEBUG nova.virt.libvirt.host [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.558 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.559 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.561 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.562 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.563 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.563 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.564 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.565 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.566 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.566 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.567 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.568 182627 DEBUG nova.virt.hardware [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.568 182627 DEBUG nova.objects.instance [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.594 182627 DEBUG oslo_concurrency.processutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.669 182627 DEBUG oslo_concurrency.processutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk.config --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.672 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "/var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.673 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.675 182627 DEBUG oslo_concurrency.lockutils [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "/var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.680 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <uuid>8577d985-21ea-4107-ba03-87076f31b935</uuid>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <name>instance-00000029</name>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <memory>196608</memory>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:name>tempest-MigrationsAdminTest-server-33381701</nova:name>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:22:47</nova:creationTime>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.micro">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:memory>192</nova:memory>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:        <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <entry name="serial">8577d985-21ea-4107-ba03-87076f31b935</entry>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <entry name="uuid">8577d985-21ea-4107-ba03-87076f31b935</entry>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/disk.config"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/console.log" append="off"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:22:47 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:22:47 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:22:47 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:22:47 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.762 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.763 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:22:47 np0005592767 nova_compute[182623]: 2026-01-22 22:22:47.764 182627 INFO nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Using config drive
Jan 22 17:22:47 np0005592767 systemd-machined[153912]: New machine qemu-23-instance-00000029.
Jan 22 17:22:47 np0005592767 systemd[1]: Started Virtual Machine qemu-23-instance-00000029.
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.142 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.540 182627 DEBUG nova.virt.libvirt.driver [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.630 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120568.630079, 8577d985-21ea-4107-ba03-87076f31b935 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.631 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] VM Resumed (Lifecycle Event)
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.638 182627 DEBUG nova.compute.manager [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.642 182627 INFO nova.virt.libvirt.driver [-] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Instance running successfully.
Jan 22 17:22:48 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.645 182627 DEBUG nova.virt.libvirt.guest [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.646 182627 DEBUG nova.virt.libvirt.driver [None req-c15948a9-3089-4415-88ff-937cd2a48a70 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.649 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.652 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.676 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.677 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120568.637853, 8577d985-21ea-4107-ba03-87076f31b935 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.677 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] VM Started (Lifecycle Event)
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.703 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:22:48 np0005592767 nova_compute[182623]: 2026-01-22 22:22:48.707 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:50Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:33:13 10.100.0.12
Jan 22 17:22:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:22:50Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:33:13 10.100.0.12
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.277 182627 DEBUG oslo_concurrency.lockutils [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-8577d985-21ea-4107-ba03-87076f31b935" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.278 182627 DEBUG oslo_concurrency.lockutils [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-8577d985-21ea-4107-ba03-87076f31b935" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.278 182627 DEBUG nova.network.neutron [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.375 182627 DEBUG nova.network.neutron [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.565 182627 DEBUG nova.network.neutron [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.580 182627 DEBUG oslo_concurrency.lockutils [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-8577d985-21ea-4107-ba03-87076f31b935" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.595 182627 DEBUG nova.virt.libvirt.driver [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Creating tmpfile /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935/tmpfsbmcgiy to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 22 17:22:50 np0005592767 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000029.scope: Deactivated successfully.
Jan 22 17:22:50 np0005592767 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000029.scope: Consumed 2.670s CPU time.
Jan 22 17:22:50 np0005592767 systemd-machined[153912]: Machine qemu-23-instance-00000029 terminated.
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.870 182627 INFO nova.virt.libvirt.driver [-] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Instance destroyed successfully.
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.871 182627 DEBUG nova.objects.instance [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.891 182627 INFO nova.virt.libvirt.driver [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Deleting instance files /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935_del
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.899 182627 INFO nova.virt.libvirt.driver [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Deletion of /var/lib/nova/instances/8577d985-21ea-4107-ba03-87076f31b935_del complete
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.972 182627 DEBUG oslo_concurrency.lockutils [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.972 182627 DEBUG oslo_concurrency.lockutils [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:22:50 np0005592767 nova_compute[182623]: 2026-01-22 22:22:50.987 182627 DEBUG nova.objects.instance [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid 8577d985-21ea-4107-ba03-87076f31b935 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:22:51 np0005592767 nova_compute[182623]: 2026-01-22 22:22:51.061 182627 DEBUG nova.compute.provider_tree [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:22:51 np0005592767 nova_compute[182623]: 2026-01-22 22:22:51.078 182627 DEBUG nova.scheduler.client.report [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:22:51 np0005592767 nova_compute[182623]: 2026-01-22 22:22:51.122 182627 DEBUG oslo_concurrency.lockutils [None req-7121858d-07b6-4609-a372-a61f4c1eb2bb 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:22:53 np0005592767 nova_compute[182623]: 2026-01-22 22:22:53.147 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:55 np0005592767 nova_compute[182623]: 2026-01-22 22:22:55.072 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:56 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:22:56 np0005592767 systemd[217066]: Activating special unit Exit the Session...
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped target Main User Target.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped target Basic System.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped target Paths.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped target Sockets.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped target Timers.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:22:56 np0005592767 systemd[217066]: Closed D-Bus User Message Bus Socket.
Jan 22 17:22:56 np0005592767 systemd[217066]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:22:56 np0005592767 systemd[217066]: Removed slice User Application Slice.
Jan 22 17:22:56 np0005592767 systemd[217066]: Reached target Shutdown.
Jan 22 17:22:56 np0005592767 systemd[217066]: Finished Exit the Session.
Jan 22 17:22:56 np0005592767 systemd[217066]: Reached target Exit the Session.
Jan 22 17:22:56 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:22:56 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:22:56 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:22:56 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:22:56 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:22:56 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:22:56 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:22:57 np0005592767 podman[217153]: 2026-01-22 22:22:57.215680975 +0000 UTC m=+0.112848574 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:22:58 np0005592767 nova_compute[182623]: 2026-01-22 22:22:58.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:22:59 np0005592767 nova_compute[182623]: 2026-01-22 22:22:59.596 182627 DEBUG nova.virt.libvirt.driver [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 17:23:00 np0005592767 nova_compute[182623]: 2026-01-22 22:23:00.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:01 np0005592767 podman[217174]: 2026-01-22 22:23:01.191102186 +0000 UTC m=+0.096662514 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41)
Jan 22 17:23:01 np0005592767 podman[217173]: 2026-01-22 22:23:01.203597771 +0000 UTC m=+0.120799030 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:23:01 np0005592767 kernel: tapb5915651-92 (unregistering): left promiscuous mode
Jan 22 17:23:01 np0005592767 NetworkManager[54973]: <info>  [1769120581.8550] device (tapb5915651-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:23:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:01Z|00156|binding|INFO|Releasing lport b5915651-92b6-4103-a6ab-70b90dfac2b8 from this chassis (sb_readonly=0)
Jan 22 17:23:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:01Z|00157|binding|INFO|Setting lport b5915651-92b6-4103-a6ab-70b90dfac2b8 down in Southbound
Jan 22 17:23:01 np0005592767 nova_compute[182623]: 2026-01-22 22:23:01.866 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:01Z|00158|binding|INFO|Removing iface tapb5915651-92 ovn-installed in OVS
Jan 22 17:23:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:01.878 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:33:13 10.100.0.12'], port_security=['fa:16:3e:a9:33:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87d4ec2c-3bc5-4c68-826d-9403021dd81a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b5915651-92b6-4103-a6ab-70b90dfac2b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:23:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:01.883 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b5915651-92b6-4103-a6ab-70b90dfac2b8 in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis#033[00m
Jan 22 17:23:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:01.886 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:23:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:01.891 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1f6b5a23-6a22-4a15-b1e4-831ef16bf56a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:01.893 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace which is not needed anymore#033[00m
Jan 22 17:23:01 np0005592767 nova_compute[182623]: 2026-01-22 22:23:01.902 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:01 np0005592767 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 22 17:23:01 np0005592767 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002b.scope: Consumed 13.528s CPU time.
Jan 22 17:23:01 np0005592767 systemd-machined[153912]: Machine qemu-22-instance-0000002b terminated.
Jan 22 17:23:02 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [NOTICE]   (216985) : haproxy version is 2.8.14-c23fe91
Jan 22 17:23:02 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [NOTICE]   (216985) : path to executable is /usr/sbin/haproxy
Jan 22 17:23:02 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [WARNING]  (216985) : Exiting Master process...
Jan 22 17:23:02 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [ALERT]    (216985) : Current worker (216987) exited with code 143 (Terminated)
Jan 22 17:23:02 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[216981]: [WARNING]  (216985) : All workers exited. Exiting... (0)
Jan 22 17:23:02 np0005592767 kernel: tapb5915651-92: entered promiscuous mode
Jan 22 17:23:02 np0005592767 kernel: tapb5915651-92 (unregistering): left promiscuous mode
Jan 22 17:23:02 np0005592767 NetworkManager[54973]: <info>  [1769120582.0933] manager: (tapb5915651-92): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Jan 22 17:23:02 np0005592767 systemd[1]: libpod-a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f.scope: Deactivated successfully.
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.094 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:02Z|00159|binding|INFO|Claiming lport b5915651-92b6-4103-a6ab-70b90dfac2b8 for this chassis.
Jan 22 17:23:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:02Z|00160|binding|INFO|b5915651-92b6-4103-a6ab-70b90dfac2b8: Claiming fa:16:3e:a9:33:13 10.100.0.12
Jan 22 17:23:02 np0005592767 podman[217246]: 2026-01-22 22:23:02.105734456 +0000 UTC m=+0.069968787 container died a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.106 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:33:13 10.100.0.12'], port_security=['fa:16:3e:a9:33:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87d4ec2c-3bc5-4c68-826d-9403021dd81a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b5915651-92b6-4103-a6ab-70b90dfac2b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.113 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:02Z|00161|binding|INFO|Releasing lport b5915651-92b6-4103-a6ab-70b90dfac2b8 from this chassis (sb_readonly=0)
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.120 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:33:13 10.100.0.12'], port_security=['fa:16:3e:a9:33:13 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87d4ec2c-3bc5-4c68-826d-9403021dd81a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b5915651-92b6-4103-a6ab-70b90dfac2b8) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.135 182627 DEBUG nova.compute.manager [req-00d8a9f6-61c1-44a9-9a65-040a5ebdff5a req-66adb190-c987-46b4-a27d-0e95fdf0fd97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received event network-vif-unplugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.135 182627 DEBUG oslo_concurrency.lockutils [req-00d8a9f6-61c1-44a9-9a65-040a5ebdff5a req-66adb190-c987-46b4-a27d-0e95fdf0fd97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.135 182627 DEBUG oslo_concurrency.lockutils [req-00d8a9f6-61c1-44a9-9a65-040a5ebdff5a req-66adb190-c987-46b4-a27d-0e95fdf0fd97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.135 182627 DEBUG oslo_concurrency.lockutils [req-00d8a9f6-61c1-44a9-9a65-040a5ebdff5a req-66adb190-c987-46b4-a27d-0e95fdf0fd97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.136 182627 DEBUG nova.compute.manager [req-00d8a9f6-61c1-44a9-9a65-040a5ebdff5a req-66adb190-c987-46b4-a27d-0e95fdf0fd97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] No waiting events found dispatching network-vif-unplugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.136 182627 WARNING nova.compute.manager [req-00d8a9f6-61c1-44a9-9a65-040a5ebdff5a req-66adb190-c987-46b4-a27d-0e95fdf0fd97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received unexpected event network-vif-unplugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 for instance with vm_state active and task_state powering-off.#033[00m
Jan 22 17:23:02 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:23:02 np0005592767 systemd[1]: var-lib-containers-storage-overlay-1bb5ddfe664bc5fe31ebc8709bb41d73bd8451e82c2c799aab40a20c2c276452-merged.mount: Deactivated successfully.
Jan 22 17:23:02 np0005592767 podman[217246]: 2026-01-22 22:23:02.164451672 +0000 UTC m=+0.128686003 container cleanup a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:23:02 np0005592767 systemd[1]: libpod-conmon-a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f.scope: Deactivated successfully.
Jan 22 17:23:02 np0005592767 podman[217291]: 2026-01-22 22:23:02.22533768 +0000 UTC m=+0.040496130 container remove a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.229 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ad115afd-65a5-45fa-97e3-c4d292737f22]: (4, ('Thu Jan 22 10:23:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f)\na7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f\nThu Jan 22 10:23:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (a7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f)\na7fcb94f39038b790af2a96f07f5de1bd0ad69963884269045e1e971ff48512f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.231 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[409e5cba-50d2-409c-9aec-4defcf190308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.232 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.251 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:02 np0005592767 kernel: tapdd5f6392-b0: left promiscuous mode
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.266 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.267 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.273 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c624f740-1877-48c0-8bf1-c42d61d92c15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.290 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cc977e22-af4d-430d-bcb6-dc3e645fa004]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.291 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[67153a48-d1f3-4f73-ba1f-011a68bf1907]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.307 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d2d001-c2a3-434c-ac20-5136cd7b6183]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415065, 'reachable_time': 40142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217311, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 systemd[1]: run-netns-ovnmeta\x2ddd5f6392\x2dbfb2\x2d42bf\x2da825\x2dc0516c8891b0.mount: Deactivated successfully.
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.312 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.313 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[bcccb52f-958c-4368-a50b-1335e117a71d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.313 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b5915651-92b6-4103-a6ab-70b90dfac2b8 in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.314 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.315 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdf54f4-416d-47d7-bc1f-66e27b277888]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.315 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b5915651-92b6-4103-a6ab-70b90dfac2b8 in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.316 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:23:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:02.317 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7b4590-b5a7-40eb-ae4b-c07783786d01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.616 182627 INFO nova.virt.libvirt.driver [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance shutdown successfully after 24 seconds.#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.622 182627 INFO nova.virt.libvirt.driver [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance destroyed successfully.#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.622 182627 DEBUG nova.objects.instance [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'numa_topology' on Instance uuid 87d4ec2c-3bc5-4c68-826d-9403021dd81a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.637 182627 DEBUG nova.compute.manager [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:02 np0005592767 nova_compute[182623]: 2026-01-22 22:23:02.712 182627 DEBUG oslo_concurrency.lockutils [None req-033d5d93-8b8a-4244-8f27-c9674ec4bb06 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:03 np0005592767 nova_compute[182623]: 2026-01-22 22:23:03.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:03 np0005592767 nova_compute[182623]: 2026-01-22 22:23:03.887 182627 DEBUG nova.compute.manager [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.004 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.005 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.029 182627 DEBUG nova.objects.instance [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.045 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.046 182627 INFO nova.compute.claims [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.046 182627 DEBUG nova.objects.instance [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'resources' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.057 182627 DEBUG nova.objects.instance [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.067 182627 DEBUG nova.objects.instance [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.104 182627 INFO nova.compute.resource_tracker [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating resource usage from migration 2ed6e609-89da-4417-b04a-071e732e3e04#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.105 182627 DEBUG nova.compute.resource_tracker [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Starting to track incoming migration 2ed6e609-89da-4417-b04a-071e732e3e04 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.212 182627 DEBUG nova.compute.provider_tree [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.225 182627 DEBUG nova.scheduler.client.report [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.243 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.243 182627 INFO nova.compute.manager [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Migrating#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.290 182627 DEBUG nova.compute.manager [req-8f212fe3-3062-4bd9-abc7-69fd96084a13 req-ec8cdcdf-6efa-427e-b1b4-c169062bb59d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.290 182627 DEBUG oslo_concurrency.lockutils [req-8f212fe3-3062-4bd9-abc7-69fd96084a13 req-ec8cdcdf-6efa-427e-b1b4-c169062bb59d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.291 182627 DEBUG oslo_concurrency.lockutils [req-8f212fe3-3062-4bd9-abc7-69fd96084a13 req-ec8cdcdf-6efa-427e-b1b4-c169062bb59d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.291 182627 DEBUG oslo_concurrency.lockutils [req-8f212fe3-3062-4bd9-abc7-69fd96084a13 req-ec8cdcdf-6efa-427e-b1b4-c169062bb59d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.291 182627 DEBUG nova.compute.manager [req-8f212fe3-3062-4bd9-abc7-69fd96084a13 req-ec8cdcdf-6efa-427e-b1b4-c169062bb59d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] No waiting events found dispatching network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:23:04 np0005592767 nova_compute[182623]: 2026-01-22 22:23:04.292 182627 WARNING nova.compute.manager [req-8f212fe3-3062-4bd9-abc7-69fd96084a13 req-ec8cdcdf-6efa-427e-b1b4-c169062bb59d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received unexpected event network-vif-plugged-b5915651-92b6-4103-a6ab-70b90dfac2b8 for instance with vm_state stopped and task_state None.#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.074 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.321 182627 DEBUG nova.compute.manager [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:05 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:23:05 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:23:05 np0005592767 systemd-logind[802]: New session 45 of user nova.
Jan 22 17:23:05 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.383 182627 INFO nova.compute.manager [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] instance snapshotting#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.385 182627 WARNING nova.compute.manager [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Jan 22 17:23:05 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:23:05 np0005592767 systemd[217316]: Queued start job for default target Main User Target.
Jan 22 17:23:05 np0005592767 systemd[217316]: Created slice User Application Slice.
Jan 22 17:23:05 np0005592767 systemd[217316]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:23:05 np0005592767 systemd[217316]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:23:05 np0005592767 systemd[217316]: Reached target Paths.
Jan 22 17:23:05 np0005592767 systemd[217316]: Reached target Timers.
Jan 22 17:23:05 np0005592767 systemd[217316]: Starting D-Bus User Message Bus Socket...
Jan 22 17:23:05 np0005592767 systemd[217316]: Starting Create User's Volatile Files and Directories...
Jan 22 17:23:05 np0005592767 systemd[217316]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:23:05 np0005592767 systemd[217316]: Reached target Sockets.
Jan 22 17:23:05 np0005592767 systemd[217316]: Finished Create User's Volatile Files and Directories.
Jan 22 17:23:05 np0005592767 systemd[217316]: Reached target Basic System.
Jan 22 17:23:05 np0005592767 systemd[217316]: Reached target Main User Target.
Jan 22 17:23:05 np0005592767 systemd[217316]: Startup finished in 129ms.
Jan 22 17:23:05 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:23:05 np0005592767 systemd[1]: Started Session 45 of User nova.
Jan 22 17:23:05 np0005592767 systemd[1]: session-45.scope: Deactivated successfully.
Jan 22 17:23:05 np0005592767 systemd-logind[802]: Session 45 logged out. Waiting for processes to exit.
Jan 22 17:23:05 np0005592767 systemd-logind[802]: Removed session 45.
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.724 182627 INFO nova.virt.libvirt.driver [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Beginning cold snapshot process#033[00m
Jan 22 17:23:05 np0005592767 systemd-logind[802]: New session 47 of user nova.
Jan 22 17:23:05 np0005592767 systemd[1]: Started Session 47 of User nova.
Jan 22 17:23:05 np0005592767 systemd[1]: session-47.scope: Deactivated successfully.
Jan 22 17:23:05 np0005592767 systemd-logind[802]: Session 47 logged out. Waiting for processes to exit.
Jan 22 17:23:05 np0005592767 systemd-logind[802]: Removed session 47.
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.869 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120570.8676634, 8577d985-21ea-4107-ba03-87076f31b935 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.869 182627 INFO nova.compute.manager [-] [instance: 8577d985-21ea-4107-ba03-87076f31b935] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.904 182627 DEBUG nova.compute.manager [None req-5f85735c-960f-4356-af94-cf4b8a7dc782 - - - - - -] [instance: 8577d985-21ea-4107-ba03-87076f31b935] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.953 182627 DEBUG nova.privsep.utils [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:23:05 np0005592767 nova_compute[182623]: 2026-01-22 22:23:05.954 182627 DEBUG oslo_concurrency.processutils [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk /var/lib/nova/instances/snapshots/tmpmhjsis_h/d8c9872f8018486b8021d9d277fae5b4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:06 np0005592767 nova_compute[182623]: 2026-01-22 22:23:06.332 182627 DEBUG oslo_concurrency.processutils [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a/disk /var/lib/nova/instances/snapshots/tmpmhjsis_h/d8c9872f8018486b8021d9d277fae5b4" returned: 0 in 0.378s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:06 np0005592767 nova_compute[182623]: 2026-01-22 22:23:06.334 182627 INFO nova.virt.libvirt.driver [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:23:08 np0005592767 podman[217348]: 2026-01-22 22:23:08.140774252 +0000 UTC m=+0.056934517 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:23:08 np0005592767 nova_compute[182623]: 2026-01-22 22:23:08.155 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:08 np0005592767 podman[217349]: 2026-01-22 22:23:08.169350193 +0000 UTC m=+0.080252059 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:23:08 np0005592767 nova_compute[182623]: 2026-01-22 22:23:08.661 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:08 np0005592767 nova_compute[182623]: 2026-01-22 22:23:08.884 182627 INFO nova.virt.libvirt.driver [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Snapshot image upload complete#033[00m
Jan 22 17:23:08 np0005592767 nova_compute[182623]: 2026-01-22 22:23:08.885 182627 INFO nova.compute.manager [None req-c2a89900-e315-470c-b373-db7aacc23cfa 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Took 3.49 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:23:09 np0005592767 nova_compute[182623]: 2026-01-22 22:23:09.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:09 np0005592767 nova_compute[182623]: 2026-01-22 22:23:09.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:10 np0005592767 nova_compute[182623]: 2026-01-22 22:23:10.077 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:10 np0005592767 nova_compute[182623]: 2026-01-22 22:23:10.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:10 np0005592767 nova_compute[182623]: 2026-01-22 22:23:10.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:23:10 np0005592767 nova_compute[182623]: 2026-01-22 22:23:10.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.126 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.127 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.127 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.127 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.847 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.848 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.848 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.849 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.849 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.865 182627 INFO nova.compute.manager [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Terminating instance#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.879 182627 DEBUG nova.compute.manager [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.888 182627 INFO nova.virt.libvirt.driver [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Instance destroyed successfully.#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.889 182627 DEBUG nova.objects.instance [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'resources' on Instance uuid 87d4ec2c-3bc5-4c68-826d-9403021dd81a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.905 182627 DEBUG nova.virt.libvirt.vif [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:22:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1916328514',display_name='tempest-ImagesTestJSON-server-1916328514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1916328514',id=43,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:22:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-7bstce2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:23:08Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=87d4ec2c-3bc5-4c68-826d-9403021dd81a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.905 182627 DEBUG nova.network.os_vif_util [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "address": "fa:16:3e:a9:33:13", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5915651-92", "ovs_interfaceid": "b5915651-92b6-4103-a6ab-70b90dfac2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.907 182627 DEBUG nova.network.os_vif_util [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.908 182627 DEBUG os_vif [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.913 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.913 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5915651-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.918 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.923 182627 INFO os_vif [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:33:13,bridge_name='br-int',has_traffic_filtering=True,id=b5915651-92b6-4103-a6ab-70b90dfac2b8,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5915651-92')#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.924 182627 INFO nova.virt.libvirt.driver [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Deleting instance files /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a_del#033[00m
Jan 22 17:23:11 np0005592767 nova_compute[182623]: 2026-01-22 22:23:11.935 182627 INFO nova.virt.libvirt.driver [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Deletion of /var/lib/nova/instances/87d4ec2c-3bc5-4c68-826d-9403021dd81a_del complete#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.052 182627 INFO nova.compute.manager [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Took 0.17 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.053 182627 DEBUG oslo.service.loopingcall [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.053 182627 DEBUG nova.compute.manager [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.054 182627 DEBUG nova.network.neutron [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:23:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:12.094 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:12.095 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:12.095 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.106 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.439 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.465 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.466 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.466 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.466 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.466 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.700 182627 DEBUG nova.network.neutron [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.720 182627 INFO nova.compute.manager [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Took 0.67 seconds to deallocate network for instance.#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.790 182627 DEBUG nova.compute.manager [req-ae9f519d-f005-4837-a6f7-1b684ea83b53 req-1d40fd1b-33df-48ed-9807-fb48669c494d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Received event network-vif-deleted-b5915651-92b6-4103-a6ab-70b90dfac2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.872 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.873 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.972 182627 DEBUG nova.compute.provider_tree [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:23:12 np0005592767 nova_compute[182623]: 2026-01-22 22:23:12.990 182627 DEBUG nova.scheduler.client.report [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:23:13 np0005592767 nova_compute[182623]: 2026-01-22 22:23:13.013 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:13 np0005592767 nova_compute[182623]: 2026-01-22 22:23:13.043 182627 INFO nova.scheduler.client.report [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Deleted allocations for instance 87d4ec2c-3bc5-4c68-826d-9403021dd81a#033[00m
Jan 22 17:23:13 np0005592767 nova_compute[182623]: 2026-01-22 22:23:13.189 182627 DEBUG oslo_concurrency.lockutils [None req-a13d1e93-84b6-44c1-816e-2bba2aea16f2 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "87d4ec2c-3bc5-4c68-826d-9403021dd81a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:13 np0005592767 nova_compute[182623]: 2026-01-22 22:23:13.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:14 np0005592767 nova_compute[182623]: 2026-01-22 22:23:14.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:23:14 np0005592767 nova_compute[182623]: 2026-01-22 22:23:14.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:14 np0005592767 nova_compute[182623]: 2026-01-22 22:23:14.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:14 np0005592767 nova_compute[182623]: 2026-01-22 22:23:14.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:14 np0005592767 nova_compute[182623]: 2026-01-22 22:23:14.921 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.033 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:15 np0005592767 podman[217390]: 2026-01-22 22:23:15.071381348 +0000 UTC m=+0.083460051 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.079 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.124 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.125 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.211 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.360 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.361 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5509MB free_disk=73.24551773071289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.361 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.362 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.407 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration for instance 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.430 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating resource usage from migration 2ed6e609-89da-4417-b04a-071e732e3e04#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.431 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Starting to track incoming migration 2ed6e609-89da-4417-b04a-071e732e3e04 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.457 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.483 182627 WARNING nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.483 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.483 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.549 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.565 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.585 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:23:15 np0005592767 nova_compute[182623]: 2026-01-22 22:23:15.586 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:16 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:23:16 np0005592767 systemd[217316]: Activating special unit Exit the Session...
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped target Main User Target.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped target Basic System.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped target Paths.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped target Sockets.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped target Timers.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:23:16 np0005592767 systemd[217316]: Closed D-Bus User Message Bus Socket.
Jan 22 17:23:16 np0005592767 systemd[217316]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:23:16 np0005592767 systemd[217316]: Removed slice User Application Slice.
Jan 22 17:23:16 np0005592767 systemd[217316]: Reached target Shutdown.
Jan 22 17:23:16 np0005592767 systemd[217316]: Finished Exit the Session.
Jan 22 17:23:16 np0005592767 systemd[217316]: Reached target Exit the Session.
Jan 22 17:23:16 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:23:16 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:23:16 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:23:16 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:23:16 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:23:16 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:23:16 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:23:16 np0005592767 nova_compute[182623]: 2026-01-22 22:23:16.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:17 np0005592767 nova_compute[182623]: 2026-01-22 22:23:17.151 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120582.149834, 87d4ec2c-3bc5-4c68-826d-9403021dd81a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:17 np0005592767 nova_compute[182623]: 2026-01-22 22:23:17.152 182627 INFO nova.compute.manager [-] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:23:17 np0005592767 nova_compute[182623]: 2026-01-22 22:23:17.171 182627 DEBUG nova.compute.manager [None req-d49defa5-e86b-46ba-a745-1afb4a0cfb3a - - - - - -] [instance: 87d4ec2c-3bc5-4c68-826d-9403021dd81a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:19 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:23:19 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:23:19 np0005592767 systemd-logind[802]: New session 48 of user nova.
Jan 22 17:23:19 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:23:19 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:23:19 np0005592767 systemd[217425]: Queued start job for default target Main User Target.
Jan 22 17:23:19 np0005592767 systemd[217425]: Created slice User Application Slice.
Jan 22 17:23:19 np0005592767 systemd[217425]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:23:19 np0005592767 systemd[217425]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:23:19 np0005592767 systemd[217425]: Reached target Paths.
Jan 22 17:23:19 np0005592767 systemd[217425]: Reached target Timers.
Jan 22 17:23:19 np0005592767 systemd[217425]: Starting D-Bus User Message Bus Socket...
Jan 22 17:23:19 np0005592767 systemd[217425]: Starting Create User's Volatile Files and Directories...
Jan 22 17:23:19 np0005592767 systemd[217425]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:23:19 np0005592767 systemd[217425]: Reached target Sockets.
Jan 22 17:23:19 np0005592767 systemd[217425]: Finished Create User's Volatile Files and Directories.
Jan 22 17:23:19 np0005592767 systemd[217425]: Reached target Basic System.
Jan 22 17:23:19 np0005592767 systemd[217425]: Reached target Main User Target.
Jan 22 17:23:19 np0005592767 systemd[217425]: Startup finished in 141ms.
Jan 22 17:23:19 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:23:19 np0005592767 systemd[1]: Started Session 48 of User nova.
Jan 22 17:23:19 np0005592767 systemd[1]: session-48.scope: Deactivated successfully.
Jan 22 17:23:19 np0005592767 systemd-logind[802]: Session 48 logged out. Waiting for processes to exit.
Jan 22 17:23:19 np0005592767 systemd-logind[802]: Removed session 48.
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.081 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:20 np0005592767 systemd-logind[802]: New session 50 of user nova.
Jan 22 17:23:20 np0005592767 systemd[1]: Started Session 50 of User nova.
Jan 22 17:23:20 np0005592767 systemd[1]: session-50.scope: Deactivated successfully.
Jan 22 17:23:20 np0005592767 systemd-logind[802]: Session 50 logged out. Waiting for processes to exit.
Jan 22 17:23:20 np0005592767 systemd-logind[802]: Removed session 50.
Jan 22 17:23:20 np0005592767 systemd-logind[802]: New session 51 of user nova.
Jan 22 17:23:20 np0005592767 systemd[1]: Started Session 51 of User nova.
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.412 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.412 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.429 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:23:20 np0005592767 systemd[1]: session-51.scope: Deactivated successfully.
Jan 22 17:23:20 np0005592767 systemd-logind[802]: Session 51 logged out. Waiting for processes to exit.
Jan 22 17:23:20 np0005592767 systemd-logind[802]: Removed session 51.
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.549 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.549 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.557 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.557 182627 INFO nova.compute.claims [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.737 182627 DEBUG nova.compute.provider_tree [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.752 182627 DEBUG nova.scheduler.client.report [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.777 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.778 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.857 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.858 182627 DEBUG nova.network.neutron [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.883 182627 INFO nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.891 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.891 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquired lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.892 182627 DEBUG nova.network.neutron [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:23:20 np0005592767 nova_compute[182623]: 2026-01-22 22:23:20.914 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.064 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.065 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.066 182627 INFO nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Creating image(s)
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.066 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "/var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.066 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "/var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.067 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "/var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.079 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.101 182627 DEBUG nova.network.neutron [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.105 182627 DEBUG nova.policy [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eed342926fef4463ad00939574b29d92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'edab01dca3144ceaaefaf47054f047c3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.164 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.164 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.165 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.176 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.242 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.244 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.275 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.276 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.277 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.331 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.332 182627 DEBUG nova.virt.disk.api [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Checking if we can resize image /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.333 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.382 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.383 182627 DEBUG nova.virt.disk.api [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Cannot resize image /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.384 182627 DEBUG nova.objects.instance [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lazy-loading 'migration_context' on Instance uuid 533d47fa-356e-452a-9c5b-734a1d5ad7eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.400 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.401 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Ensure instance console log exists: /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.401 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.402 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.402 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:21 np0005592767 nova_compute[182623]: 2026-01-22 22:23:21.920 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.084 182627 DEBUG nova.network.neutron [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.099 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Releasing lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.235 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.237 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.237 182627 INFO nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Creating image(s)
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.238 182627 DEBUG nova.objects.instance [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.254 182627 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.311 182627 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.312 182627 DEBUG nova.virt.disk.api [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Checking if we can resize image /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.312 182627 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.370 182627 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.371 182627 DEBUG nova.virt.disk.api [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Cannot resize image /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.385 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.386 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Ensure instance console log exists: /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.386 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.387 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.387 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.389 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.393 182627 WARNING nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.397 182627 DEBUG nova.virt.libvirt.host [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.398 182627 DEBUG nova.virt.libvirt.host [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.400 182627 DEBUG nova.virt.libvirt.host [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.401 182627 DEBUG nova.virt.libvirt.host [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.402 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.402 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.403 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.403 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.403 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.403 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.404 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.404 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.404 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.405 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.405 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.405 182627 DEBUG nova.virt.hardware [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.405 182627 DEBUG nova.objects.instance [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.421 182627 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.500 182627 DEBUG oslo_concurrency.processutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.502 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Acquiring lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.503 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.504 182627 DEBUG oslo_concurrency.lockutils [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] Lock "/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.509 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <uuid>3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</uuid>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <name>instance-0000002d</name>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:name>tempest-MigrationsAdminTest-server-1010543436</nova:name>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:23:22</nova:creationTime>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:user uuid="8ca7b75a121d4858bc8d282f0c6728e0">tempest-MigrationsAdminTest-381257806-project-member</nova:user>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:        <nova:project uuid="e5385c77364a4925bcdfff2bd744eb0b">tempest-MigrationsAdminTest-381257806</nova:project>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <entry name="serial">3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</entry>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <entry name="uuid">3791850d-1b20-4ce4-8ae7-e5a9bc427bf5</entry>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/disk.config"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/console.log" append="off"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:23:22 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:23:22 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:23:22 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:23:22 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.559 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.560 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:23:22 np0005592767 nova_compute[182623]: 2026-01-22 22:23:22.561 182627 INFO nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Using config drive
Jan 22 17:23:22 np0005592767 systemd-machined[153912]: New machine qemu-24-instance-0000002d.
Jan 22 17:23:22 np0005592767 systemd[1]: Started Virtual Machine qemu-24-instance-0000002d.
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.287 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120603.2870767, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.289 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Resumed (Lifecycle Event)
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.292 182627 DEBUG nova.compute.manager [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.296 182627 INFO nova.virt.libvirt.driver [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance running successfully.
Jan 22 17:23:23 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.299 182627 DEBUG nova.virt.libvirt.guest [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.299 182627 DEBUG nova.virt.libvirt.driver [None req-cca285e1-c19f-4a38-a4e7-4439c6dfe40f f00d60626e70458691fcee0da863d8bf 8b058ced40854682ba0d287cb0d1c241 - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.313 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.317 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.362 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.363 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120603.2890751, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.364 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Started (Lifecycle Event)
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.397 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:23 np0005592767 nova_compute[182623]: 2026-01-22 22:23:23.402 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:23:24 np0005592767 nova_compute[182623]: 2026-01-22 22:23:24.177 182627 DEBUG nova.network.neutron [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Successfully created port: bee11368-9daa-4202-adae-f89264cf9f5f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.083 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.552 182627 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.553 182627 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.553 182627 DEBUG nova.network.neutron [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.683 182627 DEBUG nova.network.neutron [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.962 182627 DEBUG nova.network.neutron [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.978 182627 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-3791850d-1b20-4ce4-8ae7-e5a9bc427bf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:23:25 np0005592767 nova_compute[182623]: 2026-01-22 22:23:25.991 182627 DEBUG nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Creating tmpfile /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5/tmpscgxhvsx to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 22 17:23:26 np0005592767 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 22 17:23:26 np0005592767 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002d.scope: Consumed 3.337s CPU time.
Jan 22 17:23:26 np0005592767 systemd-machined[153912]: Machine qemu-24-instance-0000002d terminated.
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.134 182627 DEBUG nova.network.neutron [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Successfully updated port: bee11368-9daa-4202-adae-f89264cf9f5f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.149 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.149 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquired lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.150 182627 DEBUG nova.network.neutron [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.256 182627 INFO nova.virt.libvirt.driver [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Instance destroyed successfully.
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.256 182627 DEBUG nova.objects.instance [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.275 182627 INFO nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Deleting instance files /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_del
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.285 182627 INFO nova.virt.libvirt.driver [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Deletion of /var/lib/nova/instances/3791850d-1b20-4ce4-8ae7-e5a9bc427bf5_del complete
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.332 182627 DEBUG nova.network.neutron [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.360 182627 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.361 182627 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.376 182627 DEBUG nova.objects.instance [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'migration_context' on Instance uuid 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.459 182627 DEBUG nova.compute.provider_tree [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.472 182627 DEBUG nova.scheduler.client.report [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.521 182627 DEBUG oslo_concurrency.lockutils [None req-f9ae7acc-73d2-47be-a424-aded7a5b8588 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:26 np0005592767 nova_compute[182623]: 2026-01-22 22:23:26.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:27 np0005592767 nova_compute[182623]: 2026-01-22 22:23:27.377 182627 DEBUG nova.compute.manager [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-changed-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:27 np0005592767 nova_compute[182623]: 2026-01-22 22:23:27.378 182627 DEBUG nova.compute.manager [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Refreshing instance network info cache due to event network-changed-bee11368-9daa-4202-adae-f89264cf9f5f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:23:27 np0005592767 nova_compute[182623]: 2026-01-22 22:23:27.378 182627 DEBUG oslo_concurrency.lockutils [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.074 182627 DEBUG nova.network.neutron [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Updating instance_info_cache with network_info: [{"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.094 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Releasing lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.094 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Instance network_info: |[{"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.095 182627 DEBUG oslo_concurrency.lockutils [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.095 182627 DEBUG nova.network.neutron [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Refreshing network info cache for port bee11368-9daa-4202-adae-f89264cf9f5f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.098 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Start _get_guest_xml network_info=[{"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.103 182627 WARNING nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.116 182627 DEBUG nova.virt.libvirt.host [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.117 182627 DEBUG nova.virt.libvirt.host [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.120 182627 DEBUG nova.virt.libvirt.host [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.121 182627 DEBUG nova.virt.libvirt.host [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.122 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.122 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.122 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.123 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.123 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.123 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.123 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.123 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.124 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.124 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.124 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.124 182627 DEBUG nova.virt.hardware [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.128 182627 DEBUG nova.virt.libvirt.vif [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:23:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=47,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMvA/CPL6TIa31HSZuHKOZSlos40RbzRicpVro4690k5p7eowVR40zEGUgSky3gqj7HWKRiQdbi7QE8X8Zam3PCiqbLUk+Xw5w+YtiEFzjWD3vAAqJ2HE7dVPPvoP6k/dQ==',key_name='tempest-keypair-321443532',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='edab01dca3144ceaaefaf47054f047c3',ramdisk_id='',reservation_id='r-fk4bm1xm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-1419442983',owner_user_name='tempest-ServersV294TestFqdnHostnames-1419442983-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:23:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eed342926fef4463ad00939574b29d92',uuid=533d47fa-356e-452a-9c5b-734a1d5ad7eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.128 182627 DEBUG nova.network.os_vif_util [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Converting VIF {"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.129 182627 DEBUG nova.network.os_vif_util [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.130 182627 DEBUG nova.objects.instance [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 533d47fa-356e-452a-9c5b-734a1d5ad7eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.148 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <uuid>533d47fa-356e-452a-9c5b-734a1d5ad7eb</uuid>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <name>instance-0000002f</name>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:name>guest-instance-1</nova:name>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:23:28</nova:creationTime>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:user uuid="eed342926fef4463ad00939574b29d92">tempest-ServersV294TestFqdnHostnames-1419442983-project-member</nova:user>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:project uuid="edab01dca3144ceaaefaf47054f047c3">tempest-ServersV294TestFqdnHostnames-1419442983</nova:project>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        <nova:port uuid="bee11368-9daa-4202-adae-f89264cf9f5f">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <entry name="serial">533d47fa-356e-452a-9c5b-734a1d5ad7eb</entry>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <entry name="uuid">533d47fa-356e-452a-9c5b-734a1d5ad7eb</entry>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.config"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:5e:de:22"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <target dev="tapbee11368-9d"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/console.log" append="off"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:23:28 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:23:28 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:23:28 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:23:28 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.148 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Preparing to wait for external event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.148 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.148 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.149 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.149 182627 DEBUG nova.virt.libvirt.vif [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:23:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=47,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMvA/CPL6TIa31HSZuHKOZSlos40RbzRicpVro4690k5p7eowVR40zEGUgSky3gqj7HWKRiQdbi7QE8X8Zam3PCiqbLUk+Xw5w+YtiEFzjWD3vAAqJ2HE7dVPPvoP6k/dQ==',key_name='tempest-keypair-321443532',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='edab01dca3144ceaaefaf47054f047c3',ramdisk_id='',reservation_id='r-fk4bm1xm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-1419442983',owner_user_name='tempest-ServersV294TestFqdnHostnames-1419442983-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:23:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eed342926fef4463ad00939574b29d92',uuid=533d47fa-356e-452a-9c5b-734a1d5ad7eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.149 182627 DEBUG nova.network.os_vif_util [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Converting VIF {"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.150 182627 DEBUG nova.network.os_vif_util [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.150 182627 DEBUG os_vif [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.151 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.151 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.154 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.155 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbee11368-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.155 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbee11368-9d, col_values=(('external_ids', {'iface-id': 'bee11368-9daa-4202-adae-f89264cf9f5f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:de:22', 'vm-uuid': '533d47fa-356e-452a-9c5b-734a1d5ad7eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:28 np0005592767 NetworkManager[54973]: <info>  [1769120608.1581] manager: (tapbee11368-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.165 182627 INFO os_vif [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d')#033[00m
Jan 22 17:23:28 np0005592767 podman[217522]: 2026-01-22 22:23:28.20069584 +0000 UTC m=+0.113652570 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.240 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.241 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.241 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] No VIF found with MAC fa:16:3e:5e:de:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.242 182627 INFO nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Using config drive#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.913 182627 INFO nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Creating config drive at /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.config#033[00m
Jan 22 17:23:28 np0005592767 nova_compute[182623]: 2026-01-22 22:23:28.926 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphlhxf9u3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.071 182627 DEBUG oslo_concurrency.processutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphlhxf9u3" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:29 np0005592767 kernel: tapbee11368-9d: entered promiscuous mode
Jan 22 17:23:29 np0005592767 NetworkManager[54973]: <info>  [1769120609.1624] manager: (tapbee11368-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:29Z|00162|binding|INFO|Claiming lport bee11368-9daa-4202-adae-f89264cf9f5f for this chassis.
Jan 22 17:23:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:29Z|00163|binding|INFO|bee11368-9daa-4202-adae-f89264cf9f5f: Claiming fa:16:3e:5e:de:22 10.100.0.4
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.185 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:de:22 10.100.0.4'], port_security=['fa:16:3e:5e:de:22 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '533d47fa-356e-452a-9c5b-734a1d5ad7eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'edab01dca3144ceaaefaf47054f047c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6ad0df2d-5051-47de-9b75-7536701cede4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95f8671b-f5f0-41e8-b064-1402ad421053, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=bee11368-9daa-4202-adae-f89264cf9f5f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.186 104135 INFO neutron.agent.ovn.metadata.agent [-] Port bee11368-9daa-4202-adae-f89264cf9f5f in datapath dc7d5d50-f986-460a-be6f-94d81f5e7124 bound to our chassis#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.188 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc7d5d50-f986-460a-be6f-94d81f5e7124#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.207 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8a94cf37-b713-4da0-ab54-1b8219abb401]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.208 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc7d5d50-f1 in ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.210 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc7d5d50-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.210 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c1997846-4e3f-4c4b-928d-59fa8e42746f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 systemd-udevd[217563]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.211 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d27a45e7-c2c9-430f-8da2-cc3f0954e825]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.218 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:29Z|00164|binding|INFO|Setting lport bee11368-9daa-4202-adae-f89264cf9f5f ovn-installed in OVS
Jan 22 17:23:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:29Z|00165|binding|INFO|Setting lport bee11368-9daa-4202-adae-f89264cf9f5f up in Southbound
Jan 22 17:23:29 np0005592767 systemd-machined[153912]: New machine qemu-25-instance-0000002f.
Jan 22 17:23:29 np0005592767 NetworkManager[54973]: <info>  [1769120609.2270] device (tapbee11368-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:23:29 np0005592767 NetworkManager[54973]: <info>  [1769120609.2284] device (tapbee11368-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.225 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.233 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[93f21f44-0cda-4175-b1bc-158a38c20ad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 systemd[1]: Started Virtual Machine qemu-25-instance-0000002f.
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.263 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[496e326d-d7fd-4a91-a5f9-c958e8d41a7d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.301 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd7e3f0-d2eb-4027-aa24-9b7b3fd87680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.308 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e6de3823-a206-448b-8493-8b9c0c729dcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 NetworkManager[54973]: <info>  [1769120609.3096] manager: (tapdc7d5d50-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Jan 22 17:23:29 np0005592767 systemd-udevd[217567]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.340 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[62dcf209-a445-415b-bf3e-cb6d3babbb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.344 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[599655a6-29d1-4cc2-a567-411f96369107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 NetworkManager[54973]: <info>  [1769120609.3606] device (tapdc7d5d50-f0): carrier: link connected
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.364 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5c51e5-f807-435b-9df1-9b242c42f3ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.377 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb32261-6bd3-4d42-9fe2-e530d909e300]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc7d5d50-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:e3:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420593, 'reachable_time': 27041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217596, 'error': None, 'target': 'ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.389 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ef122de6-5cd5-4d0e-8b5e-b9d9517753f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:e365'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420593, 'tstamp': 420593}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217597, 'error': None, 'target': 'ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.400 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e484f871-c5df-480a-86a9-83d4676270c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc7d5d50-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:e3:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420593, 'reachable_time': 27041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217598, 'error': None, 'target': 'ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.422 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0d9486-a1d5-4a20-8d37-034dcd4fa6ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.469 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bae95b6a-a8c1-4edf-9965-3a1e910738b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.470 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc7d5d50-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.470 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.470 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc7d5d50-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:29 np0005592767 NetworkManager[54973]: <info>  [1769120609.4738] manager: (tapdc7d5d50-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 22 17:23:29 np0005592767 kernel: tapdc7d5d50-f0: entered promiscuous mode
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.474 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.477 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc7d5d50-f0, col_values=(('external_ids', {'iface-id': 'a7b543d6-e47a-4d31-8f0a-22cb63660b19'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.478 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.478 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:29Z|00166|binding|INFO|Releasing lport a7b543d6-e47a-4d31-8f0a-22cb63660b19 from this chassis (sb_readonly=0)
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.479 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc7d5d50-f986-460a-be6f-94d81f5e7124.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc7d5d50-f986-460a-be6f-94d81f5e7124.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.480 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f897930e-7e10-47f2-822f-9f9c5bd91ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.481 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-dc7d5d50-f986-460a-be6f-94d81f5e7124
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/dc7d5d50-f986-460a-be6f-94d81f5e7124.pid.haproxy
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID dc7d5d50-f986-460a-be6f-94d81f5e7124
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:23:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:29.481 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'env', 'PROCESS_TAG=haproxy-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc7d5d50-f986-460a-be6f-94d81f5e7124.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.491 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.729 182627 DEBUG nova.compute.manager [req-7432ab2b-521e-4e44-a845-3f87dc1dc9c9 req-277562b8-2190-42f6-9313-034e58c260ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.730 182627 DEBUG oslo_concurrency.lockutils [req-7432ab2b-521e-4e44-a845-3f87dc1dc9c9 req-277562b8-2190-42f6-9313-034e58c260ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.731 182627 DEBUG oslo_concurrency.lockutils [req-7432ab2b-521e-4e44-a845-3f87dc1dc9c9 req-277562b8-2190-42f6-9313-034e58c260ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.731 182627 DEBUG oslo_concurrency.lockutils [req-7432ab2b-521e-4e44-a845-3f87dc1dc9c9 req-277562b8-2190-42f6-9313-034e58c260ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:29 np0005592767 nova_compute[182623]: 2026-01-22 22:23:29.731 182627 DEBUG nova.compute.manager [req-7432ab2b-521e-4e44-a845-3f87dc1dc9c9 req-277562b8-2190-42f6-9313-034e58c260ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Processing event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:23:29 np0005592767 podman[217630]: 2026-01-22 22:23:29.851610707 +0000 UTC m=+0.062831717 container create 0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:23:29 np0005592767 podman[217630]: 2026-01-22 22:23:29.81826515 +0000 UTC m=+0.029486230 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:23:29 np0005592767 systemd[1]: Started libpod-conmon-0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9.scope.
Jan 22 17:23:29 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:23:29 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccf60d35788b03f9279bb262ffa4268bff81a71a6075272d5e5dbb87e3da3992/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:23:29 np0005592767 podman[217630]: 2026-01-22 22:23:29.966392718 +0000 UTC m=+0.177613738 container init 0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:23:29 np0005592767 podman[217630]: 2026-01-22 22:23:29.974877444 +0000 UTC m=+0.186098464 container start 0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:23:30 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [NOTICE]   (217654) : New worker (217657) forked
Jan 22 17:23:30 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [NOTICE]   (217654) : Loading success.
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.070 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120610.069441, 533d47fa-356e-452a-9c5b-734a1d5ad7eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.070 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] VM Started (Lifecycle Event)#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.073 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.077 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.080 182627 INFO nova.virt.libvirt.driver [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Instance spawned successfully.#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.080 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.084 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.090 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.099 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.101 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.102 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.102 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.102 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.103 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.103 182627 DEBUG nova.virt.libvirt.driver [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.125 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.125 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120610.0716476, 533d47fa-356e-452a-9c5b-734a1d5ad7eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.125 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.151 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.154 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120610.07656, 533d47fa-356e-452a-9c5b-734a1d5ad7eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.154 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.169 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.171 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.218 182627 INFO nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Took 9.15 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.219 182627 DEBUG nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.224 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.296 182627 INFO nova.compute.manager [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Took 9.79 seconds to build instance.#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.309 182627 DEBUG oslo_concurrency.lockutils [None req-2bae754e-957f-434c-b5f4-4086e0cf8648 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.386 182627 DEBUG nova.network.neutron [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Updated VIF entry in instance network info cache for port bee11368-9daa-4202-adae-f89264cf9f5f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.387 182627 DEBUG nova.network.neutron [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Updating instance_info_cache with network_info: [{"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:30 np0005592767 nova_compute[182623]: 2026-01-22 22:23:30.403 182627 DEBUG oslo_concurrency.lockutils [req-1ed27eae-c8cd-4281-840d-5d292f831105 req-32ded9a3-5fe8-4e38-b69a-54d93c834735 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:23:30 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:23:30 np0005592767 systemd[217425]: Activating special unit Exit the Session...
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped target Main User Target.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped target Basic System.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped target Paths.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped target Sockets.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped target Timers.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:23:30 np0005592767 systemd[217425]: Closed D-Bus User Message Bus Socket.
Jan 22 17:23:30 np0005592767 systemd[217425]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:23:30 np0005592767 systemd[217425]: Removed slice User Application Slice.
Jan 22 17:23:30 np0005592767 systemd[217425]: Reached target Shutdown.
Jan 22 17:23:30 np0005592767 systemd[217425]: Finished Exit the Session.
Jan 22 17:23:30 np0005592767 systemd[217425]: Reached target Exit the Session.
Jan 22 17:23:30 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:23:30 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:23:30 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:23:30 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:23:30 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:23:30 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:23:30 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:23:31 np0005592767 nova_compute[182623]: 2026-01-22 22:23:31.846 182627 DEBUG nova.compute.manager [req-9eb3bf5c-3b74-403f-a320-ac63ba09224f req-4f61e8a9-b4a9-46c3-8a55-261b1c479503 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:31 np0005592767 nova_compute[182623]: 2026-01-22 22:23:31.848 182627 DEBUG oslo_concurrency.lockutils [req-9eb3bf5c-3b74-403f-a320-ac63ba09224f req-4f61e8a9-b4a9-46c3-8a55-261b1c479503 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:31 np0005592767 nova_compute[182623]: 2026-01-22 22:23:31.848 182627 DEBUG oslo_concurrency.lockutils [req-9eb3bf5c-3b74-403f-a320-ac63ba09224f req-4f61e8a9-b4a9-46c3-8a55-261b1c479503 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:31 np0005592767 nova_compute[182623]: 2026-01-22 22:23:31.848 182627 DEBUG oslo_concurrency.lockutils [req-9eb3bf5c-3b74-403f-a320-ac63ba09224f req-4f61e8a9-b4a9-46c3-8a55-261b1c479503 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:31 np0005592767 nova_compute[182623]: 2026-01-22 22:23:31.849 182627 DEBUG nova.compute.manager [req-9eb3bf5c-3b74-403f-a320-ac63ba09224f req-4f61e8a9-b4a9-46c3-8a55-261b1c479503 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] No waiting events found dispatching network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:23:31 np0005592767 nova_compute[182623]: 2026-01-22 22:23:31.849 182627 WARNING nova.compute.manager [req-9eb3bf5c-3b74-403f-a320-ac63ba09224f req-4f61e8a9-b4a9-46c3-8a55-261b1c479503 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received unexpected event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f for instance with vm_state active and task_state None.#033[00m
Jan 22 17:23:32 np0005592767 podman[217670]: 2026-01-22 22:23:32.165312107 +0000 UTC m=+0.074843260 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:23:32 np0005592767 podman[217669]: 2026-01-22 22:23:32.186682851 +0000 UTC m=+0.098160139 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 17:23:33 np0005592767 nova_compute[182623]: 2026-01-22 22:23:33.158 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:33 np0005592767 NetworkManager[54973]: <info>  [1769120613.2199] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 22 17:23:33 np0005592767 NetworkManager[54973]: <info>  [1769120613.2210] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 22 17:23:33 np0005592767 nova_compute[182623]: 2026-01-22 22:23:33.219 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:33 np0005592767 nova_compute[182623]: 2026-01-22 22:23:33.289 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:33Z|00167|binding|INFO|Releasing lport a7b543d6-e47a-4d31-8f0a-22cb63660b19 from this chassis (sb_readonly=0)
Jan 22 17:23:33 np0005592767 nova_compute[182623]: 2026-01-22 22:23:33.304 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:35 np0005592767 nova_compute[182623]: 2026-01-22 22:23:35.087 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:35 np0005592767 nova_compute[182623]: 2026-01-22 22:23:35.175 182627 DEBUG nova.compute.manager [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-changed-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:35 np0005592767 nova_compute[182623]: 2026-01-22 22:23:35.176 182627 DEBUG nova.compute.manager [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Refreshing instance network info cache due to event network-changed-bee11368-9daa-4202-adae-f89264cf9f5f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:23:35 np0005592767 nova_compute[182623]: 2026-01-22 22:23:35.176 182627 DEBUG oslo_concurrency.lockutils [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:23:35 np0005592767 nova_compute[182623]: 2026-01-22 22:23:35.176 182627 DEBUG oslo_concurrency.lockutils [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:23:35 np0005592767 nova_compute[182623]: 2026-01-22 22:23:35.176 182627 DEBUG nova.network.neutron [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Refreshing network info cache for port bee11368-9daa-4202-adae-f89264cf9f5f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:23:37 np0005592767 nova_compute[182623]: 2026-01-22 22:23:37.050 182627 DEBUG nova.network.neutron [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Updated VIF entry in instance network info cache for port bee11368-9daa-4202-adae-f89264cf9f5f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:23:37 np0005592767 nova_compute[182623]: 2026-01-22 22:23:37.050 182627 DEBUG nova.network.neutron [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Updating instance_info_cache with network_info: [{"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:37 np0005592767 nova_compute[182623]: 2026-01-22 22:23:37.068 182627 DEBUG oslo_concurrency.lockutils [req-d5a0f651-a85e-4631-9ec8-15150beb3127 req-b12f3f5a-46bf-464b-a06b-54dbac4c6646 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-533d47fa-356e-452a-9c5b-734a1d5ad7eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.238 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.238 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.238 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.238 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.239 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.249 182627 INFO nova.compute.manager [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Terminating instance#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.261 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.261 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquired lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.261 182627 DEBUG nova.network.neutron [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:23:38 np0005592767 nova_compute[182623]: 2026-01-22 22:23:38.445 182627 DEBUG nova.network.neutron [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.108 182627 DEBUG nova.network.neutron [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.127 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Releasing lock "refresh_cache-ce913c81-c8b7-4b71-91b0-ec941d59dc1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.127 182627 DEBUG nova.compute.manager [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:23:39 np0005592767 podman[217715]: 2026-01-22 22:23:39.142192602 +0000 UTC m=+0.058062465 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:23:39 np0005592767 podman[217716]: 2026-01-22 22:23:39.16119244 +0000 UTC m=+0.069593225 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:23:39 np0005592767 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 22 17:23:39 np0005592767 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000027.scope: Consumed 14.855s CPU time.
Jan 22 17:23:39 np0005592767 systemd-machined[153912]: Machine qemu-20-instance-00000027 terminated.
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.381 182627 INFO nova.virt.libvirt.driver [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance destroyed successfully.#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.382 182627 DEBUG nova.objects.instance [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lazy-loading 'resources' on Instance uuid ce913c81-c8b7-4b71-91b0-ec941d59dc1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.442 182627 INFO nova.virt.libvirt.driver [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Deleting instance files /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_del#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.449 182627 INFO nova.virt.libvirt.driver [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Deletion of /var/lib/nova/instances/ce913c81-c8b7-4b71-91b0-ec941d59dc1c_del complete#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.542 182627 INFO nova.compute.manager [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.543 182627 DEBUG oslo.service.loopingcall [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.543 182627 DEBUG nova.compute.manager [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.543 182627 DEBUG nova.network.neutron [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.947 182627 DEBUG nova.network.neutron [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.972 182627 DEBUG nova.network.neutron [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:23:39 np0005592767 nova_compute[182623]: 2026-01-22 22:23:39.993 182627 INFO nova.compute.manager [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Took 0.45 seconds to deallocate network for instance.
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.081 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.082 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.091 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.147 182627 DEBUG nova.compute.provider_tree [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.162 182627 DEBUG nova.scheduler.client.report [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.183 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.210 182627 INFO nova.scheduler.client.report [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Deleted allocations for instance ce913c81-c8b7-4b71-91b0-ec941d59dc1c
Jan 22 17:23:40 np0005592767 nova_compute[182623]: 2026-01-22 22:23:40.320 182627 DEBUG oslo_concurrency.lockutils [None req-aee49ec0-684a-481a-9b59-466a7d8bc918 8ca7b75a121d4858bc8d282f0c6728e0 e5385c77364a4925bcdfff2bd744eb0b - - default default] Lock "ce913c81-c8b7-4b71-91b0-ec941d59dc1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:41 np0005592767 nova_compute[182623]: 2026-01-22 22:23:41.254 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120606.2530034, 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:23:41 np0005592767 nova_compute[182623]: 2026-01-22 22:23:41.255 182627 INFO nova.compute.manager [-] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] VM Stopped (Lifecycle Event)
Jan 22 17:23:41 np0005592767 nova_compute[182623]: 2026-01-22 22:23:41.274 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:41.274 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:23:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:41.276 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:23:41 np0005592767 nova_compute[182623]: 2026-01-22 22:23:41.278 182627 DEBUG nova.compute.manager [None req-5a5ac24a-db9d-4a8a-8c47-2fce5f98b9ce - - - - - -] [instance: 3791850d-1b20-4ce4-8ae7-e5a9bc427bf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:23:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:42Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:de:22 10.100.0.4
Jan 22 17:23:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:42Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:de:22 10.100.0.4
Jan 22 17:23:43 np0005592767 nova_compute[182623]: 2026-01-22 22:23:43.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:44.279 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:45 np0005592767 nova_compute[182623]: 2026-01-22 22:23:45.091 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:46 np0005592767 podman[217784]: 2026-01-22 22:23:46.128648991 +0000 UTC m=+0.052090989 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:23:48 np0005592767 nova_compute[182623]: 2026-01-22 22:23:48.172 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:49.373 104394 DEBUG eventlet.wsgi.server [-] (104394) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:49.377 104394 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: Accept: */*
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: Connection: close
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: Content-Type: text/plain
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: Host: 169.254.169.254
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: User-Agent: curl/7.84.0
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: X-Forwarded-For: 10.100.0.4
Jan 22 17:23:49 np0005592767 ovn_metadata_agent[104130]: X-Ovn-Network-Id: dc7d5d50-f986-460a-be6f-94d81f5e7124 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 17:23:50 np0005592767 nova_compute[182623]: 2026-01-22 22:23:50.094 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:50.961 104394 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 17:23:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:50.962 104394 INFO eventlet.wsgi.server [-] 10.100.0.4,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1671 time: 1.5857308
Jan 22 17:23:50 np0005592767 haproxy-metadata-proxy-dc7d5d50-f986-460a-be6f-94d81f5e7124[217657]: 10.100.0.4:53426 [22/Jan/2026:22:23:49.372] listener listener/metadata 0/0/0/1590/1590 200 1655 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.240 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.241 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.241 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.241 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.242 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.252 182627 INFO nova.compute.manager [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Terminating instance
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.261 182627 DEBUG nova.compute.manager [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:23:51 np0005592767 kernel: tapbee11368-9d (unregistering): left promiscuous mode
Jan 22 17:23:51 np0005592767 NetworkManager[54973]: <info>  [1769120631.2902] device (tapbee11368-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.292 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:51Z|00168|binding|INFO|Releasing lport bee11368-9daa-4202-adae-f89264cf9f5f from this chassis (sb_readonly=0)
Jan 22 17:23:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:51Z|00169|binding|INFO|Setting lport bee11368-9daa-4202-adae-f89264cf9f5f down in Southbound
Jan 22 17:23:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:51Z|00170|binding|INFO|Removing iface tapbee11368-9d ovn-installed in OVS
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.296 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.320 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:de:22 10.100.0.4'], port_security=['fa:16:3e:5e:de:22 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '533d47fa-356e-452a-9c5b-734a1d5ad7eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'edab01dca3144ceaaefaf47054f047c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ad0df2d-5051-47de-9b75-7536701cede4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95f8671b-f5f0-41e8-b064-1402ad421053, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=bee11368-9daa-4202-adae-f89264cf9f5f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.321 104135 INFO neutron.agent.ovn.metadata.agent [-] Port bee11368-9daa-4202-adae-f89264cf9f5f in datapath dc7d5d50-f986-460a-be6f-94d81f5e7124 unbound from our chassis
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.323 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc7d5d50-f986-460a-be6f-94d81f5e7124, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.326 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.326 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5a645c8e-ee62-4355-97cf-515a7d6da650]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.327 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124 namespace which is not needed anymore
Jan 22 17:23:51 np0005592767 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 22 17:23:51 np0005592767 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002f.scope: Consumed 13.241s CPU time.
Jan 22 17:23:51 np0005592767 systemd-machined[153912]: Machine qemu-25-instance-0000002f terminated.
Jan 22 17:23:51 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [NOTICE]   (217654) : haproxy version is 2.8.14-c23fe91
Jan 22 17:23:51 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [NOTICE]   (217654) : path to executable is /usr/sbin/haproxy
Jan 22 17:23:51 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [WARNING]  (217654) : Exiting Master process...
Jan 22 17:23:51 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [ALERT]    (217654) : Current worker (217657) exited with code 143 (Terminated)
Jan 22 17:23:51 np0005592767 neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124[217645]: [WARNING]  (217654) : All workers exited. Exiting... (0)
Jan 22 17:23:51 np0005592767 systemd[1]: libpod-0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9.scope: Deactivated successfully.
Jan 22 17:23:51 np0005592767 podman[217833]: 2026-01-22 22:23:51.453252479 +0000 UTC m=+0.045070044 container died 0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:23:51 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9-userdata-shm.mount: Deactivated successfully.
Jan 22 17:23:51 np0005592767 systemd[1]: var-lib-containers-storage-overlay-ccf60d35788b03f9279bb262ffa4268bff81a71a6075272d5e5dbb87e3da3992-merged.mount: Deactivated successfully.
Jan 22 17:23:51 np0005592767 podman[217833]: 2026-01-22 22:23:51.492110189 +0000 UTC m=+0.083927764 container cleanup 0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:23:51 np0005592767 systemd[1]: libpod-conmon-0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9.scope: Deactivated successfully.
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.523 182627 INFO nova.virt.libvirt.driver [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Instance destroyed successfully.
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.523 182627 DEBUG nova.objects.instance [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lazy-loading 'resources' on Instance uuid 533d47fa-356e-452a-9c5b-734a1d5ad7eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.590 182627 DEBUG nova.virt.libvirt.vif [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:23:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=47,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMvA/CPL6TIa31HSZuHKOZSlos40RbzRicpVro4690k5p7eowVR40zEGUgSky3gqj7HWKRiQdbi7QE8X8Zam3PCiqbLUk+Xw5w+YtiEFzjWD3vAAqJ2HE7dVPPvoP6k/dQ==',key_name='tempest-keypair-321443532',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:23:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='edab01dca3144ceaaefaf47054f047c3',ramdisk_id='',reservation_id='r-fk4bm1xm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-1419442983',owner_user_name='tempest-ServersV294TestFqdnHostnames-1419442983-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:23:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eed342926fef4463ad00939574b29d92',uuid=533d47fa-356e-452a-9c5b-734a1d5ad7eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.591 182627 DEBUG nova.network.os_vif_util [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Converting VIF {"id": "bee11368-9daa-4202-adae-f89264cf9f5f", "address": "fa:16:3e:5e:de:22", "network": {"id": "dc7d5d50-f986-460a-be6f-94d81f5e7124", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2076496415-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "edab01dca3144ceaaefaf47054f047c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbee11368-9d", "ovs_interfaceid": "bee11368-9daa-4202-adae-f89264cf9f5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.591 182627 DEBUG nova.network.os_vif_util [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.592 182627 DEBUG os_vif [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.594 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.594 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbee11368-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.596 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.597 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.599 182627 INFO os_vif [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:de:22,bridge_name='br-int',has_traffic_filtering=True,id=bee11368-9daa-4202-adae-f89264cf9f5f,network=Network(dc7d5d50-f986-460a-be6f-94d81f5e7124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbee11368-9d')
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.599 182627 INFO nova.virt.libvirt.driver [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Deleting instance files /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb_del
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.600 182627 INFO nova.virt.libvirt.driver [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Deletion of /var/lib/nova/instances/533d47fa-356e-452a-9c5b-734a1d5ad7eb_del complete
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.720 182627 DEBUG nova.compute.manager [req-1cc7f2ef-1755-4b5c-bd39-2f728b195739 req-b75cd460-80df-432b-aca9-0f6871d2a6b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-vif-unplugged-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.720 182627 DEBUG oslo_concurrency.lockutils [req-1cc7f2ef-1755-4b5c-bd39-2f728b195739 req-b75cd460-80df-432b-aca9-0f6871d2a6b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.721 182627 DEBUG oslo_concurrency.lockutils [req-1cc7f2ef-1755-4b5c-bd39-2f728b195739 req-b75cd460-80df-432b-aca9-0f6871d2a6b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.721 182627 DEBUG oslo_concurrency.lockutils [req-1cc7f2ef-1755-4b5c-bd39-2f728b195739 req-b75cd460-80df-432b-aca9-0f6871d2a6b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.721 182627 DEBUG nova.compute.manager [req-1cc7f2ef-1755-4b5c-bd39-2f728b195739 req-b75cd460-80df-432b-aca9-0f6871d2a6b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] No waiting events found dispatching network-vif-unplugged-bee11368-9daa-4202-adae-f89264cf9f5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.770 182627 DEBUG nova.compute.manager [req-1cc7f2ef-1755-4b5c-bd39-2f728b195739 req-b75cd460-80df-432b-aca9-0f6871d2a6b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-vif-unplugged-bee11368-9daa-4202-adae-f89264cf9f5f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 17:23:51 np0005592767 podman[217875]: 2026-01-22 22:23:51.856677783 +0000 UTC m=+0.344929159 container remove 0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.866 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0b5a97-fd28-4cae-9bf6-2c81949698f8]: (4, ('Thu Jan 22 10:23:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124 (0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9)\n0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9\nThu Jan 22 10:23:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124 (0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9)\n0422510f39564e34e6d50f45d6ce6302eb11086288afe190fe68441937a081e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.868 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ce55be3b-c2f5-41a3-a18b-d18bfe137cd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.869 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc7d5d50-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.872 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:51 np0005592767 kernel: tapdc7d5d50-f0: left promiscuous mode
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.885 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.891 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ed197c39-1034-4a49-9314-c724038dfecf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.912 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf073d6-2239-43fa-9d78-03f5495928b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.914 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e1839673-3bf5-49f9-94ad-b8f1334a80f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.944 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bd416d9a-ab75-4ee8-a685-369f769207a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420587, 'reachable_time': 31681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217896, 'error': None, 'target': 'ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 systemd[1]: run-netns-ovnmeta\x2ddc7d5d50\x2df986\x2d460a\x2dbe6f\x2d94d81f5e7124.mount: Deactivated successfully.
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.950 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc7d5d50-f986-460a-be6f-94d81f5e7124 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:23:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:51.951 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3037c949-75fa-498d-a308-80bc0a205a06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.957 182627 INFO nova.compute.manager [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Took 0.70 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.958 182627 DEBUG oslo.service.loopingcall [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.958 182627 DEBUG nova.compute.manager [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:23:51 np0005592767 nova_compute[182623]: 2026-01-22 22:23:51.958 182627 DEBUG nova.network.neutron [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.208 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.208 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.232 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.299 182627 DEBUG nova.network.neutron [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.320 182627 INFO nova.compute.manager [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Took 1.36 seconds to deallocate network for instance.#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.341 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.341 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.349 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.349 182627 INFO nova.compute.claims [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.398 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.428 182627 DEBUG nova.compute.manager [req-0f7c722d-497b-4db4-85cd-b0278a016eac req-04a4cba1-f426-45d1-8d45-7b6a91863c05 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-vif-deleted-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.483 182627 DEBUG nova.compute.provider_tree [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.503 182627 DEBUG nova.scheduler.client.report [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.521 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.522 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.523 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.580 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.581 182627 DEBUG nova.network.neutron [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.598 182627 INFO nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.628 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.636 182627 DEBUG nova.compute.provider_tree [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.656 182627 DEBUG nova.scheduler.client.report [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.686 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.723 182627 INFO nova.scheduler.client.report [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Deleted allocations for instance 533d47fa-356e-452a-9c5b-734a1d5ad7eb#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.774 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.775 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.776 182627 INFO nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Creating image(s)#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.777 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "/var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.777 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.778 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "/var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.778 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "ba85555e7564e6a234e110f556e0425220bc4643" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.779 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "ba85555e7564e6a234e110f556e0425220bc4643" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.820 182627 DEBUG oslo_concurrency.lockutils [None req-af53f083-5eea-4653-b1b4-5371c51047c5 eed342926fef4463ad00939574b29d92 edab01dca3144ceaaefaf47054f047c3 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.873 182627 DEBUG nova.policy [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52d9fe7f0e8b4edf92fa2064aaab8bca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3a2ee662fba426c8f688455b20759bf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.935 182627 DEBUG nova.compute.manager [req-c2e1fa88-914c-45ed-a3fd-f7c8c527d1ad req-ba19478e-5b16-459d-9ae0-411cea1e32e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.935 182627 DEBUG oslo_concurrency.lockutils [req-c2e1fa88-914c-45ed-a3fd-f7c8c527d1ad req-ba19478e-5b16-459d-9ae0-411cea1e32e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.935 182627 DEBUG oslo_concurrency.lockutils [req-c2e1fa88-914c-45ed-a3fd-f7c8c527d1ad req-ba19478e-5b16-459d-9ae0-411cea1e32e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.936 182627 DEBUG oslo_concurrency.lockutils [req-c2e1fa88-914c-45ed-a3fd-f7c8c527d1ad req-ba19478e-5b16-459d-9ae0-411cea1e32e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "533d47fa-356e-452a-9c5b-734a1d5ad7eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.936 182627 DEBUG nova.compute.manager [req-c2e1fa88-914c-45ed-a3fd-f7c8c527d1ad req-ba19478e-5b16-459d-9ae0-411cea1e32e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] No waiting events found dispatching network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:23:53 np0005592767 nova_compute[182623]: 2026-01-22 22:23:53.936 182627 WARNING nova.compute.manager [req-c2e1fa88-914c-45ed-a3fd-f7c8c527d1ad req-ba19478e-5b16-459d-9ae0-411cea1e32e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Received unexpected event network-vif-plugged-bee11368-9daa-4202-adae-f89264cf9f5f for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:23:54 np0005592767 nova_compute[182623]: 2026-01-22 22:23:54.381 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120619.3793426, ce913c81-c8b7-4b71-91b0-ec941d59dc1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:54 np0005592767 nova_compute[182623]: 2026-01-22 22:23:54.382 182627 INFO nova.compute.manager [-] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:23:54 np0005592767 nova_compute[182623]: 2026-01-22 22:23:54.432 182627 DEBUG nova.compute.manager [None req-9c83813a-9bfa-4b73-b454-875b4f17f72d - - - - - -] [instance: ce913c81-c8b7-4b71-91b0-ec941d59dc1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.621 182627 DEBUG nova.network.neutron [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Successfully created port: 37f85273-ea9a-4ab5-9dbf-5881f18b133a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.644 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.715 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.part --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.716 182627 DEBUG nova.virt.images [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] 67813da3-246b-4cf8-b06c-3086ab4bc987 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.717 182627 DEBUG nova.privsep.utils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.718 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.part /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.855 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.part /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.converted" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.861 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.935 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.938 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "ba85555e7564e6a234e110f556e0425220bc4643" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:55 np0005592767 nova_compute[182623]: 2026-01-22 22:23:55.952 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.019 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.020 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "ba85555e7564e6a234e110f556e0425220bc4643" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.021 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "ba85555e7564e6a234e110f556e0425220bc4643" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.039 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.092 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.093 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643,backing_fmt=raw /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.125 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643,backing_fmt=raw /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.127 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "ba85555e7564e6a234e110f556e0425220bc4643" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.127 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.194 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.196 182627 DEBUG nova.objects.instance [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'migration_context' on Instance uuid 422f993e-c0c6-47af-889b-1a4f3a052bca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.210 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.211 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Ensure instance console log exists: /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.212 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.212 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.213 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.597 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.607 182627 DEBUG nova.network.neutron [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Successfully updated port: 37f85273-ea9a-4ab5-9dbf-5881f18b133a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.627 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "refresh_cache-422f993e-c0c6-47af-889b-1a4f3a052bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.628 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquired lock "refresh_cache-422f993e-c0c6-47af-889b-1a4f3a052bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.628 182627 DEBUG nova.network.neutron [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:23:56 np0005592767 nova_compute[182623]: 2026-01-22 22:23:56.836 182627 DEBUG nova.network.neutron [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.140 182627 DEBUG nova.compute.manager [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-changed-37f85273-ea9a-4ab5-9dbf-5881f18b133a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.140 182627 DEBUG nova.compute.manager [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Refreshing instance network info cache due to event network-changed-37f85273-ea9a-4ab5-9dbf-5881f18b133a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.141 182627 DEBUG oslo_concurrency.lockutils [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-422f993e-c0c6-47af-889b-1a4f3a052bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.594 182627 DEBUG nova.network.neutron [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Updating instance_info_cache with network_info: [{"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.617 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Releasing lock "refresh_cache-422f993e-c0c6-47af-889b-1a4f3a052bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.618 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Instance network_info: |[{"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.618 182627 DEBUG oslo_concurrency.lockutils [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-422f993e-c0c6-47af-889b-1a4f3a052bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.619 182627 DEBUG nova.network.neutron [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Refreshing network info cache for port 37f85273-ea9a-4ab5-9dbf-5881f18b133a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.622 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Start _get_guest_xml network_info=[{"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2026-01-22T22:23:43Z,direct_url=<?>,disk_format='qcow2',id=67813da3-246b-4cf8-b06c-3086ab4bc987,min_disk=1,min_ram=0,name='tempest-test-snap-1341920130',owner='d3a2ee662fba426c8f688455b20759bf',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2026-01-22T22:23:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '67813da3-246b-4cf8-b06c-3086ab4bc987'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.627 182627 WARNING nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.631 182627 DEBUG nova.virt.libvirt.host [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.632 182627 DEBUG nova.virt.libvirt.host [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.638 182627 DEBUG nova.virt.libvirt.host [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.638 182627 DEBUG nova.virt.libvirt.host [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.640 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.640 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2026-01-22T22:23:43Z,direct_url=<?>,disk_format='qcow2',id=67813da3-246b-4cf8-b06c-3086ab4bc987,min_disk=1,min_ram=0,name='tempest-test-snap-1341920130',owner='d3a2ee662fba426c8f688455b20759bf',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2026-01-22T22:23:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.641 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.641 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.641 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.641 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.642 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.642 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.642 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.643 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.643 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.643 182627 DEBUG nova.virt.hardware [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.648 182627 DEBUG nova.virt.libvirt.vif [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:23:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-312875368',display_name='tempest-ImagesTestJSON-server-312875368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-312875368',id=50,image_ref='67813da3-246b-4cf8-b06c-3086ab4bc987',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-6vnbmvzk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='5d4456c4-888d-4a4f-b820-b7eed8f26b8b',image_min_disk='1',image_min_ram='0',image_owner_id='d3a2ee662fba426c8f688455b20759bf',image_owner_project_name='tempest-ImagesTestJSON-23148374',image_owner_user_name='tempest-ImagesTestJSON-23148374-project-member',image_user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:23:53Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=422f993e-c0c6-47af-889b-1a4f3a052bca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.648 182627 DEBUG nova.network.os_vif_util [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.649 182627 DEBUG nova.network.os_vif_util [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.650 182627 DEBUG nova.objects.instance [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 422f993e-c0c6-47af-889b-1a4f3a052bca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.663 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <uuid>422f993e-c0c6-47af-889b-1a4f3a052bca</uuid>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <name>instance-00000032</name>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:name>tempest-ImagesTestJSON-server-312875368</nova:name>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:23:58</nova:creationTime>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:user uuid="52d9fe7f0e8b4edf92fa2064aaab8bca">tempest-ImagesTestJSON-23148374-project-member</nova:user>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:project uuid="d3a2ee662fba426c8f688455b20759bf">tempest-ImagesTestJSON-23148374</nova:project>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="67813da3-246b-4cf8-b06c-3086ab4bc987"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        <nova:port uuid="37f85273-ea9a-4ab5-9dbf-5881f18b133a">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <entry name="serial">422f993e-c0c6-47af-889b-1a4f3a052bca</entry>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <entry name="uuid">422f993e-c0c6-47af-889b-1a4f3a052bca</entry>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.config"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:04:58:6e"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <target dev="tap37f85273-ea"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/console.log" append="off"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <input type="keyboard" bus="usb"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:23:58 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:23:58 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:23:58 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:23:58 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.664 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Preparing to wait for external event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.664 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.665 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.665 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.666 182627 DEBUG nova.virt.libvirt.vif [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:23:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-312875368',display_name='tempest-ImagesTestJSON-server-312875368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-312875368',id=50,image_ref='67813da3-246b-4cf8-b06c-3086ab4bc987',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-6vnbmvzk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='5d4456c4-888d-4a4f-b820-b7eed8f26b8b',image_min_disk='1',image_min_ram='0',image_owner_id='d3a2ee662fba426c8f688455b20759bf',image_owner_project_name='tempest-ImagesTestJSON-23148374',image_owner_user_name='tempest-ImagesTestJSON-23148374-project-member',image_user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:23:53Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=422f993e-c0c6-47af-889b-1a4f3a052bca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.666 182627 DEBUG nova.network.os_vif_util [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.667 182627 DEBUG nova.network.os_vif_util [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.668 182627 DEBUG os_vif [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.669 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.669 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.670 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.672 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.672 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37f85273-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.673 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37f85273-ea, col_values=(('external_ids', {'iface-id': '37f85273-ea9a-4ab5-9dbf-5881f18b133a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:58:6e', 'vm-uuid': '422f993e-c0c6-47af-889b-1a4f3a052bca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.674 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:58 np0005592767 NetworkManager[54973]: <info>  [1769120638.6756] manager: (tap37f85273-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.677 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.679 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.680 182627 INFO os_vif [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea')#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.751 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.752 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.752 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] No VIF found with MAC fa:16:3e:04:58:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.752 182627 INFO nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Using config drive#033[00m
Jan 22 17:23:58 np0005592767 podman[217924]: 2026-01-22 22:23:58.759944182 +0000 UTC m=+0.050519906 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.825 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:58 np0005592767 nova_compute[182623]: 2026-01-22 22:23:58.980 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.198 182627 INFO nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Creating config drive at /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.config#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.204 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfzhsf8oa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.337 182627 DEBUG oslo_concurrency.processutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfzhsf8oa" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:23:59 np0005592767 kernel: tap37f85273-ea: entered promiscuous mode
Jan 22 17:23:59 np0005592767 NetworkManager[54973]: <info>  [1769120639.3906] manager: (tap37f85273-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 22 17:23:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:59Z|00171|binding|INFO|Claiming lport 37f85273-ea9a-4ab5-9dbf-5881f18b133a for this chassis.
Jan 22 17:23:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:59Z|00172|binding|INFO|37f85273-ea9a-4ab5-9dbf-5881f18b133a: Claiming fa:16:3e:04:58:6e 10.100.0.13
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.390 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.392 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.399 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:58:6e 10.100.0.13'], port_security=['fa:16:3e:04:58:6e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '422f993e-c0c6-47af-889b-1a4f3a052bca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=37f85273-ea9a-4ab5-9dbf-5881f18b133a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.400 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 37f85273-ea9a-4ab5-9dbf-5881f18b133a in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 bound to our chassis#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.401 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd5f6392-bfb2-42bf-a825-c0516c8891b0#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.411 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cd968798-84e3-4a83-ad2f-de27c1e15f19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.412 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd5f6392-b1 in ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.414 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd5f6392-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.414 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9211c0-4420-4e22-9b77-c72017d2d2eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 systemd-udevd[217960]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.415 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2dbb089e-7882-4660-800b-9a0627f2551d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 NetworkManager[54973]: <info>  [1769120639.4247] device (tap37f85273-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:23:59 np0005592767 NetworkManager[54973]: <info>  [1769120639.4254] device (tap37f85273-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.426 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[9803eb32-f727-4428-a521-f1724f93e5fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 systemd-machined[153912]: New machine qemu-26-instance-00000032.
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.463 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:59Z|00173|binding|INFO|Setting lport 37f85273-ea9a-4ab5-9dbf-5881f18b133a ovn-installed in OVS
Jan 22 17:23:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:59Z|00174|binding|INFO|Setting lport 37f85273-ea9a-4ab5-9dbf-5881f18b133a up in Southbound
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.469 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.469 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dae79a1c-a055-47a2-a279-74c378b34300]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 systemd[1]: Started Virtual Machine qemu-26-instance-00000032.
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.498 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f18c4fc4-8990-4fb8-967f-51f252834d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.503 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[01bbef34-0186-4cfd-b287-f1dfd19d386e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 NetworkManager[54973]: <info>  [1769120639.5043] manager: (tapdd5f6392-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.531 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d671919d-1e48-433a-93ce-e42af1bb337c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.534 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0649d414-342e-49de-9c3e-297edec8ef8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 NetworkManager[54973]: <info>  [1769120639.5524] device (tapdd5f6392-b0): carrier: link connected
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.556 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[036f3d85-637b-4474-a920-76bb417b9355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.570 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8da42125-a93d-49f2-a972-f609b3348ec5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423613, 'reachable_time': 32424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217995, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.582 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[03e85add-edaf-4152-827f-482f04938bf4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:d723'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423613, 'tstamp': 423613}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217996, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.596 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3788e2ba-a22a-4e8e-a3ff-921354c4f157]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd5f6392-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:d7:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423613, 'reachable_time': 32424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217997, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.622 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6efff93c-ba4b-4dc1-b95f-b228f130dade]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.678 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1da45769-2f2e-4b90-8216-4126090faef9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.679 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.679 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.680 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd5f6392-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:59 np0005592767 kernel: tapdd5f6392-b0: entered promiscuous mode
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.681 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 NetworkManager[54973]: <info>  [1769120639.6822] manager: (tapdd5f6392-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.684 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd5f6392-b0, col_values=(('external_ids', {'iface-id': 'c2b5e191-6c34-4707-83d4-b3c5bc12ff1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.684 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:23:59Z|00175|binding|INFO|Releasing lport c2b5e191-6c34-4707-83d4-b3c5bc12ff1e from this chassis (sb_readonly=0)
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.687 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.688 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.688 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bccb2935-6f99-4168-96bc-c0391e87af1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.689 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/dd5f6392-bfb2-42bf-a825-c0516c8891b0.pid.haproxy
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID dd5f6392-bfb2-42bf-a825-c0516c8891b0
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:23:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:23:59.689 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'env', 'PROCESS_TAG=haproxy-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd5f6392-bfb2-42bf-a825-c0516c8891b0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.697 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.711 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120639.7107596, 422f993e-c0c6-47af-889b-1a4f3a052bca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.711 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] VM Started (Lifecycle Event)#033[00m
Jan 22 17:23:59 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.739 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.741 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120639.7109857, 422f993e-c0c6-47af-889b-1a4f3a052bca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.742 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.758 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.761 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:23:59 np0005592767 nova_compute[182623]: 2026-01-22 22:23:59.785 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:24:00 np0005592767 podman[218037]: 2026-01-22 22:24:00.079742516 +0000 UTC m=+0.054852295 container create a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:24:00 np0005592767 nova_compute[182623]: 2026-01-22 22:24:00.098 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:00 np0005592767 systemd[1]: Started libpod-conmon-a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd.scope.
Jan 22 17:24:00 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:24:00 np0005592767 podman[218037]: 2026-01-22 22:24:00.049811044 +0000 UTC m=+0.024920853 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:24:00 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef346085b327a3d48e1ffefda7160577953285881318e902453bd21f281b45a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:24:00 np0005592767 podman[218037]: 2026-01-22 22:24:00.160397398 +0000 UTC m=+0.135507207 container init a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:24:00 np0005592767 podman[218037]: 2026-01-22 22:24:00.164968705 +0000 UTC m=+0.140078484 container start a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:24:00 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [NOTICE]   (218056) : New worker (218058) forked
Jan 22 17:24:00 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [NOTICE]   (218056) : Loading success.
Jan 22 17:24:00 np0005592767 nova_compute[182623]: 2026-01-22 22:24:00.936 182627 DEBUG nova.network.neutron [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Updated VIF entry in instance network info cache for port 37f85273-ea9a-4ab5-9dbf-5881f18b133a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:24:00 np0005592767 nova_compute[182623]: 2026-01-22 22:24:00.937 182627 DEBUG nova.network.neutron [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Updating instance_info_cache with network_info: [{"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:00 np0005592767 nova_compute[182623]: 2026-01-22 22:24:00.972 182627 DEBUG oslo_concurrency.lockutils [req-856899e9-13ac-47cf-8fcf-5bee9410fbe9 req-0108a267-7543-4b3a-bb50-c2c9112defd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-422f993e-c0c6-47af-889b-1a4f3a052bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.067 182627 DEBUG nova.compute.manager [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.067 182627 DEBUG oslo_concurrency.lockutils [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.068 182627 DEBUG oslo_concurrency.lockutils [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.068 182627 DEBUG oslo_concurrency.lockutils [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.069 182627 DEBUG nova.compute.manager [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Processing event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.069 182627 DEBUG nova.compute.manager [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.069 182627 DEBUG oslo_concurrency.lockutils [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.070 182627 DEBUG oslo_concurrency.lockutils [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.070 182627 DEBUG oslo_concurrency.lockutils [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.071 182627 DEBUG nova.compute.manager [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] No waiting events found dispatching network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.071 182627 WARNING nova.compute.manager [req-07486ad2-916d-452d-a760-89e27c6ee71c req-eef26a5a-7e06-4227-baea-395ad24b86b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received unexpected event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.072 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.076 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120641.0763388, 422f993e-c0c6-47af-889b-1a4f3a052bca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.076 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.080 182627 DEBUG nova.virt.libvirt.driver [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.085 182627 INFO nova.virt.libvirt.driver [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Instance spawned successfully.#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.086 182627 INFO nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Took 7.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.086 182627 DEBUG nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.121 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.125 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.157 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.195 182627 INFO nova.compute.manager [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Took 7.89 seconds to build instance.#033[00m
Jan 22 17:24:01 np0005592767 nova_compute[182623]: 2026-01-22 22:24:01.217 182627 DEBUG oslo_concurrency.lockutils [None req-343a9500-c26a-4c3f-8f1a-377542e40546 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:03 np0005592767 podman[218069]: 2026-01-22 22:24:03.14478974 +0000 UTC m=+0.060954506 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:24:03 np0005592767 podman[218068]: 2026-01-22 22:24:03.167277835 +0000 UTC m=+0.088372188 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.676 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.779 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.780 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.780 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.781 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.781 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.792 182627 INFO nova.compute.manager [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Terminating instance#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.803 182627 DEBUG nova.compute.manager [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:24:03 np0005592767 kernel: tap37f85273-ea (unregistering): left promiscuous mode
Jan 22 17:24:03 np0005592767 NetworkManager[54973]: <info>  [1769120643.8216] device (tap37f85273-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.829 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:03Z|00176|binding|INFO|Releasing lport 37f85273-ea9a-4ab5-9dbf-5881f18b133a from this chassis (sb_readonly=0)
Jan 22 17:24:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:03Z|00177|binding|INFO|Setting lport 37f85273-ea9a-4ab5-9dbf-5881f18b133a down in Southbound
Jan 22 17:24:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:03Z|00178|binding|INFO|Removing iface tap37f85273-ea ovn-installed in OVS
Jan 22 17:24:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:03.840 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:58:6e 10.100.0.13'], port_security=['fa:16:3e:04:58:6e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '422f993e-c0c6-47af-889b-1a4f3a052bca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3a2ee662fba426c8f688455b20759bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1948254d-7c65-4b2f-a3b0-945b1c0d9215', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0221eb3-fb7d-4931-b902-8b58313a674d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=37f85273-ea9a-4ab5-9dbf-5881f18b133a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:24:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:03.842 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 37f85273-ea9a-4ab5-9dbf-5881f18b133a in datapath dd5f6392-bfb2-42bf-a825-c0516c8891b0 unbound from our chassis#033[00m
Jan 22 17:24:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:03.845 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd5f6392-bfb2-42bf-a825-c0516c8891b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:24:03 np0005592767 nova_compute[182623]: 2026-01-22 22:24:03.846 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:03.846 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[913e67f3-f943-4025-98b9-1b4cfb740ed5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:03.847 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 namespace which is not needed anymore#033[00m
Jan 22 17:24:03 np0005592767 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 22 17:24:03 np0005592767 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000032.scope: Consumed 3.106s CPU time.
Jan 22 17:24:03 np0005592767 systemd-machined[153912]: Machine qemu-26-instance-00000032 terminated.
Jan 22 17:24:03 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [NOTICE]   (218056) : haproxy version is 2.8.14-c23fe91
Jan 22 17:24:03 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [NOTICE]   (218056) : path to executable is /usr/sbin/haproxy
Jan 22 17:24:03 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [WARNING]  (218056) : Exiting Master process...
Jan 22 17:24:03 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [ALERT]    (218056) : Current worker (218058) exited with code 143 (Terminated)
Jan 22 17:24:03 np0005592767 neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0[218052]: [WARNING]  (218056) : All workers exited. Exiting... (0)
Jan 22 17:24:03 np0005592767 systemd[1]: libpod-a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd.scope: Deactivated successfully.
Jan 22 17:24:03 np0005592767 podman[218137]: 2026-01-22 22:24:03.96185004 +0000 UTC m=+0.039997422 container died a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:24:03 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd-userdata-shm.mount: Deactivated successfully.
Jan 22 17:24:03 np0005592767 systemd[1]: var-lib-containers-storage-overlay-ef346085b327a3d48e1ffefda7160577953285881318e902453bd21f281b45a6-merged.mount: Deactivated successfully.
Jan 22 17:24:03 np0005592767 podman[218137]: 2026-01-22 22:24:03.996920755 +0000 UTC m=+0.075068127 container cleanup a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:24:04 np0005592767 systemd[1]: libpod-conmon-a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd.scope: Deactivated successfully.
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.021 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 podman[218169]: 2026-01-22 22:24:04.050089223 +0000 UTC m=+0.036678081 container remove a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.058 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7749789d-6cd6-468f-893b-95387349ed9b]: (4, ('Thu Jan 22 10:24:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd)\na1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd\nThu Jan 22 10:24:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 (a1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd)\na1a80a8ff980b5c8d8c3236305804f33a3d6fd452087fc4da08bdb6db7abcafd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[09f541ac-80dd-464b-8932-c36f287082ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.061 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd5f6392-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 kernel: tapdd5f6392-b0: left promiscuous mode
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.067 182627 INFO nova.virt.libvirt.driver [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Instance destroyed successfully.#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.067 182627 DEBUG nova.objects.instance [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lazy-loading 'resources' on Instance uuid 422f993e-c0c6-47af-889b-1a4f3a052bca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.082 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.085 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4c197a35-06f0-445b-ba79-3982a78579db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.088 182627 DEBUG nova.virt.libvirt.vif [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:23:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-312875368',display_name='tempest-ImagesTestJSON-server-312875368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-312875368',id=50,image_ref='67813da3-246b-4cf8-b06c-3086ab4bc987',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d3a2ee662fba426c8f688455b20759bf',ramdisk_id='',reservation_id='r-6vnbmvzk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='5d4456c4-888d-4a4f-b820-b7eed8f26b8b',image_min_disk='1',image_min_ram='0',image_owner_id='d3a2ee662fba426c8f688455b20759bf',image_owner_project_name='tempest-ImagesTestJSON-23148374',image_owner_user_name='tempest-ImagesTestJSON-23148374-project-member',image_user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',owner_project_name='tempest-ImagesTestJSON-23148374',owner_user_name='tempest-ImagesTestJSON-23148374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:01Z,user_data=None,user_id='52d9fe7f0e8b4edf92fa2064aaab8bca',uuid=422f993e-c0c6-47af-889b-1a4f3a052bca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.089 182627 DEBUG nova.network.os_vif_util [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converting VIF {"id": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "address": "fa:16:3e:04:58:6e", "network": {"id": "dd5f6392-bfb2-42bf-a825-c0516c8891b0", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1994090273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3a2ee662fba426c8f688455b20759bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37f85273-ea", "ovs_interfaceid": "37f85273-ea9a-4ab5-9dbf-5881f18b133a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.090 182627 DEBUG nova.network.os_vif_util [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.091 182627 DEBUG os_vif [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.093 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.093 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37f85273-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.095 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.098 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.101 182627 INFO os_vif [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:58:6e,bridge_name='br-int',has_traffic_filtering=True,id=37f85273-ea9a-4ab5-9dbf-5881f18b133a,network=Network(dd5f6392-bfb2-42bf-a825-c0516c8891b0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37f85273-ea')#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.101 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[95a8d8de-4c08-4c21-866e-2f2bc34de458]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.102 182627 INFO nova.virt.libvirt.driver [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Deleting instance files /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca_del#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.102 182627 INFO nova.virt.libvirt.driver [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Deletion of /var/lib/nova/instances/422f993e-c0c6-47af-889b-1a4f3a052bca_del complete#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.102 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[45e2852c-1cae-425a-b13e-ac61700aa615]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.117 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d6121f61-030c-4f43-be04-190eb46b2e20]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423607, 'reachable_time': 35990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218203, 'error': None, 'target': 'ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.119 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd5f6392-bfb2-42bf-a825-c0516c8891b0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:24:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:04.119 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c418aedd-7d7d-42e2-aa8d-fb61d62b0e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:04 np0005592767 systemd[1]: run-netns-ovnmeta\x2ddd5f6392\x2dbfb2\x2d42bf\x2da825\x2dc0516c8891b0.mount: Deactivated successfully.
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.181 182627 INFO nova.compute.manager [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.181 182627 DEBUG oslo.service.loopingcall [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.182 182627 DEBUG nova.compute.manager [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.182 182627 DEBUG nova.network.neutron [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.211 182627 DEBUG nova.compute.manager [req-fd3a2200-03a1-4404-8ced-8a9de6d5c9e7 req-242ec17b-8752-41ea-ac4f-291d63f603da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-vif-unplugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.212 182627 DEBUG oslo_concurrency.lockutils [req-fd3a2200-03a1-4404-8ced-8a9de6d5c9e7 req-242ec17b-8752-41ea-ac4f-291d63f603da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.212 182627 DEBUG oslo_concurrency.lockutils [req-fd3a2200-03a1-4404-8ced-8a9de6d5c9e7 req-242ec17b-8752-41ea-ac4f-291d63f603da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.213 182627 DEBUG oslo_concurrency.lockutils [req-fd3a2200-03a1-4404-8ced-8a9de6d5c9e7 req-242ec17b-8752-41ea-ac4f-291d63f603da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.213 182627 DEBUG nova.compute.manager [req-fd3a2200-03a1-4404-8ced-8a9de6d5c9e7 req-242ec17b-8752-41ea-ac4f-291d63f603da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] No waiting events found dispatching network-vif-unplugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:04 np0005592767 nova_compute[182623]: 2026-01-22 22:24:04.213 182627 DEBUG nova.compute.manager [req-fd3a2200-03a1-4404-8ced-8a9de6d5c9e7 req-242ec17b-8752-41ea-ac4f-291d63f603da 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-vif-unplugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.101 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.215 182627 DEBUG nova.network.neutron [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.240 182627 INFO nova.compute.manager [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Took 1.06 seconds to deallocate network for instance.#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.312 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.313 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.345 182627 DEBUG nova.compute.manager [req-02f8bfd6-ed7d-4c67-a2cd-94383e1a30f2 req-127b310f-abd4-4dba-a841-553fb4fc8c4f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-vif-deleted-37f85273-ea9a-4ab5-9dbf-5881f18b133a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.406 182627 DEBUG nova.compute.provider_tree [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.424 182627 DEBUG nova.scheduler.client.report [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.477 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.548 182627 INFO nova.scheduler.client.report [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Deleted allocations for instance 422f993e-c0c6-47af-889b-1a4f3a052bca#033[00m
Jan 22 17:24:05 np0005592767 nova_compute[182623]: 2026-01-22 22:24:05.641 182627 DEBUG oslo_concurrency.lockutils [None req-9e773941-18da-485f-a189-7006ca3f017e 52d9fe7f0e8b4edf92fa2064aaab8bca d3a2ee662fba426c8f688455b20759bf - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.396 182627 DEBUG nova.compute.manager [req-87fe33a6-71fb-4d2b-9b52-0574acffb601 req-666dd0bf-42a5-49ea-91cf-093081b1fdff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.396 182627 DEBUG oslo_concurrency.lockutils [req-87fe33a6-71fb-4d2b-9b52-0574acffb601 req-666dd0bf-42a5-49ea-91cf-093081b1fdff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.396 182627 DEBUG oslo_concurrency.lockutils [req-87fe33a6-71fb-4d2b-9b52-0574acffb601 req-666dd0bf-42a5-49ea-91cf-093081b1fdff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.397 182627 DEBUG oslo_concurrency.lockutils [req-87fe33a6-71fb-4d2b-9b52-0574acffb601 req-666dd0bf-42a5-49ea-91cf-093081b1fdff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "422f993e-c0c6-47af-889b-1a4f3a052bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.397 182627 DEBUG nova.compute.manager [req-87fe33a6-71fb-4d2b-9b52-0574acffb601 req-666dd0bf-42a5-49ea-91cf-093081b1fdff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] No waiting events found dispatching network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.397 182627 WARNING nova.compute.manager [req-87fe33a6-71fb-4d2b-9b52-0574acffb601 req-666dd0bf-42a5-49ea-91cf-093081b1fdff 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Received unexpected event network-vif-plugged-37f85273-ea9a-4ab5-9dbf-5881f18b133a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.522 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120631.5211277, 533d47fa-356e-452a-9c5b-734a1d5ad7eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.523 182627 INFO nova.compute.manager [-] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:24:06 np0005592767 nova_compute[182623]: 2026-01-22 22:24:06.545 182627 DEBUG nova.compute.manager [None req-1f6dbff2-b20e-4136-833b-251a6fa50420 - - - - - -] [instance: 533d47fa-356e-452a-9c5b-734a1d5ad7eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:24:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:24:07 np0005592767 nova_compute[182623]: 2026-01-22 22:24:07.953 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:07 np0005592767 nova_compute[182623]: 2026-01-22 22:24:07.953 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:07 np0005592767 nova_compute[182623]: 2026-01-22 22:24:07.975 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.102 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.103 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.117 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.118 182627 INFO nova.compute.claims [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.222 182627 DEBUG nova.compute.provider_tree [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.242 182627 DEBUG nova.scheduler.client.report [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.267 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.267 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.330 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.330 182627 DEBUG nova.network.neutron [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.365 182627 INFO nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.382 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.496 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.498 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.499 182627 INFO nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Creating image(s)#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.500 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.500 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.502 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.526 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.594 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.595 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.596 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.610 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.660 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.661 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.699 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.700 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.701 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.756 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.757 182627 DEBUG nova.virt.disk.api [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Checking if we can resize image /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.757 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.776 182627 DEBUG nova.policy [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee60032f6e6844d2b5f32dec17c83e5b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d4efed580924587923a2cc36dca176f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.830 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.831 182627 DEBUG nova.virt.disk.api [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Cannot resize image /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.831 182627 DEBUG nova.objects.instance [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lazy-loading 'migration_context' on Instance uuid a8a23884-c76d-4690-a418-d67ad5bd459c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.842 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.843 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Ensure instance console log exists: /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.844 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.845 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:08 np0005592767 nova_compute[182623]: 2026-01-22 22:24:08.845 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:09 np0005592767 nova_compute[182623]: 2026-01-22 22:24:09.096 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:09 np0005592767 nova_compute[182623]: 2026-01-22 22:24:09.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:09 np0005592767 nova_compute[182623]: 2026-01-22 22:24:09.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:09 np0005592767 nova_compute[182623]: 2026-01-22 22:24:09.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:09 np0005592767 nova_compute[182623]: 2026-01-22 22:24:09.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:24:10 np0005592767 nova_compute[182623]: 2026-01-22 22:24:10.070 182627 DEBUG nova.network.neutron [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Successfully created port: 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:24:10 np0005592767 nova_compute[182623]: 2026-01-22 22:24:10.136 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:10 np0005592767 podman[218219]: 2026-01-22 22:24:10.187059221 +0000 UTC m=+0.095150566 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 17:24:10 np0005592767 podman[218220]: 2026-01-22 22:24:10.187557865 +0000 UTC m=+0.092613715 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:24:10 np0005592767 nova_compute[182623]: 2026-01-22 22:24:10.944 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:10 np0005592767 nova_compute[182623]: 2026-01-22 22:24:10.945 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.653 182627 DEBUG nova.network.neutron [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Successfully updated port: 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.671 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.672 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.672 182627 DEBUG nova.network.neutron [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.872 182627 DEBUG nova.compute.manager [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.872 182627 DEBUG nova.compute.manager [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing instance network info cache due to event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.873 182627 DEBUG oslo_concurrency.lockutils [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.921 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.921 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:11 np0005592767 nova_compute[182623]: 2026-01-22 22:24:11.922 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:12.095 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:12.096 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:12.096 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:12 np0005592767 nova_compute[182623]: 2026-01-22 22:24:12.192 182627 DEBUG nova.network.neutron [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:24:12 np0005592767 nova_compute[182623]: 2026-01-22 22:24:12.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.322 182627 DEBUG nova.network.neutron [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.344 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.345 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Instance network_info: |[{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.345 182627 DEBUG oslo_concurrency.lockutils [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.345 182627 DEBUG nova.network.neutron [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.348 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Start _get_guest_xml network_info=[{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.353 182627 WARNING nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.358 182627 DEBUG nova.virt.libvirt.host [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.359 182627 DEBUG nova.virt.libvirt.host [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.362 182627 DEBUG nova.virt.libvirt.host [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.363 182627 DEBUG nova.virt.libvirt.host [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.364 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.364 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.365 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.365 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.365 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.366 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.366 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.366 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.367 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.367 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.367 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.367 182627 DEBUG nova.virt.hardware [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.371 182627 DEBUG nova.virt.libvirt.vif [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.372 182627 DEBUG nova.network.os_vif_util [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.372 182627 DEBUG nova.network.os_vif_util [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.374 182627 DEBUG nova.objects.instance [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lazy-loading 'pci_devices' on Instance uuid a8a23884-c76d-4690-a418-d67ad5bd459c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.395 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <uuid>a8a23884-c76d-4690-a418-d67ad5bd459c</uuid>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <name>instance-00000033</name>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:name>tempest-tempest.common.compute-instance-1726954682</nova:name>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:24:13</nova:creationTime>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:user uuid="ee60032f6e6844d2b5f32dec17c83e5b">tempest-AttachInterfacesTestJSON-1827119457-project-member</nova:user>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:project uuid="4d4efed580924587923a2cc36dca176f">tempest-AttachInterfacesTestJSON-1827119457</nova:project>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        <nova:port uuid="3162b3ad-4f6d-4a7d-9f3b-a752bedd5395">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <entry name="serial">a8a23884-c76d-4690-a418-d67ad5bd459c</entry>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <entry name="uuid">a8a23884-c76d-4690-a418-d67ad5bd459c</entry>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.config"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:87:37:a6"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <target dev="tap3162b3ad-4f"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/console.log" append="off"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:24:13 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:24:13 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:24:13 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:24:13 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.396 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Preparing to wait for external event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.396 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.397 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.397 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.397 182627 DEBUG nova.virt.libvirt.vif [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:24:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.398 182627 DEBUG nova.network.os_vif_util [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.398 182627 DEBUG nova.network.os_vif_util [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.399 182627 DEBUG os_vif [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.399 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.399 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.400 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.402 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.402 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3162b3ad-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.402 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3162b3ad-4f, col_values=(('external_ids', {'iface-id': '3162b3ad-4f6d-4a7d-9f3b-a752bedd5395', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:37:a6', 'vm-uuid': 'a8a23884-c76d-4690-a418-d67ad5bd459c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.404 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:13 np0005592767 NetworkManager[54973]: <info>  [1769120653.4052] manager: (tap3162b3ad-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.407 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.412 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.413 182627 INFO os_vif [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f')#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.469 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.470 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.470 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No VIF found with MAC fa:16:3e:87:37:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.470 182627 INFO nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Using config drive#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.882 182627 INFO nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Creating config drive at /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.config#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.889 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt5rvslm3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:13 np0005592767 nova_compute[182623]: 2026-01-22 22:24:13.909 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.019 182627 DEBUG oslo_concurrency.processutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt5rvslm3" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:14 np0005592767 kernel: tap3162b3ad-4f: entered promiscuous mode
Jan 22 17:24:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:14Z|00179|binding|INFO|Claiming lport 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 for this chassis.
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:14Z|00180|binding|INFO|3162b3ad-4f6d-4a7d-9f3b-a752bedd5395: Claiming fa:16:3e:87:37:a6 10.100.0.3
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.1023] manager: (tap3162b3ad-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.110 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.124 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.1259] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.1268] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.131 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:37:a6 10.100.0.3'], port_security=['fa:16:3e:87:37:a6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'a8a23884-c76d-4690-a418-d67ad5bd459c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-500a5e98-86da-4709-b96b-eead8466e9b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d4efed580924587923a2cc36dca176f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '928ee6fb-b498-4b31-9898-b1b27063fabb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93e1ed03-3d53-4a6c-95c4-ab8774c4d51c, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.132 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 in datapath 500a5e98-86da-4709-b96b-eead8466e9b7 bound to our chassis#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.134 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 500a5e98-86da-4709-b96b-eead8466e9b7#033[00m
Jan 22 17:24:14 np0005592767 systemd-machined[153912]: New machine qemu-27-instance-00000033.
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.150 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad9f4d4-1d4a-4670-abd9-2ff617201fcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.151 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap500a5e98-81 in ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.154 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap500a5e98-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.154 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d2de233c-c5f2-4dcc-9af2-8df61d18c203]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.155 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2a703013-d34e-4651-ba7a-87cd4b6303f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 systemd[1]: Started Virtual Machine qemu-27-instance-00000033.
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.166 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f4c8b1-4dba-4686-b985-b2c1c5f9cf48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 systemd-udevd[218287]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.1931] device (tap3162b3ad-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.1944] device (tap3162b3ad-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.196 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c4644d-58ed-46a3-8544-c7c238beeb0d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.222 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.228 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c66fe68a-626a-4ab9-9c32-1ae2d4f8c8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.233 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[46cc46c3-7da1-446e-a3e9-8c05cdce4056]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.2373] manager: (tap500a5e98-80): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Jan 22 17:24:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:14Z|00181|binding|INFO|Setting lport 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 ovn-installed in OVS
Jan 22 17:24:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:14Z|00182|binding|INFO|Setting lport 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 up in Southbound
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.244 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.248 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.272 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[41d2e129-60d7-4319-8345-bf6e10d95d1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.276 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c5770755-b5b4-4a12-80f8-8f866a4501cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.3021] device (tap500a5e98-80): carrier: link connected
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.311 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3f78a80b-11c4-4899-95ce-7ed252ab03e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.330 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4404a679-2d9c-4dba-890e-37a055f6495e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap500a5e98-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:85:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425088, 'reachable_time': 25429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218317, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.347 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9f08f7-32b7-4e9d-93e8-3c0fc1e349db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:852f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425088, 'tstamp': 425088}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218318, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.363 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[039dfa9a-f1ba-4edc-9d64-5d0b1440d47d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap500a5e98-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:85:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425088, 'reachable_time': 25429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218319, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.391 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[02076d1a-c45a-4ae3-bf0a-2b1a25e588c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.439 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[694f2855-c4aa-47da-b3e8-8e9523ffc97a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.441 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap500a5e98-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.441 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.441 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap500a5e98-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:14 np0005592767 kernel: tap500a5e98-80: entered promiscuous mode
Jan 22 17:24:14 np0005592767 NetworkManager[54973]: <info>  [1769120654.4437] manager: (tap500a5e98-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.443 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.446 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap500a5e98-80, col_values=(('external_ids', {'iface-id': 'f0c70c85-5528-4622-8533-7199b27961a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:14Z|00183|binding|INFO|Releasing lport f0c70c85-5528-4622-8533-7199b27961a2 from this chassis (sb_readonly=0)
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.450 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/500a5e98-86da-4709-b96b-eead8466e9b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/500a5e98-86da-4709-b96b-eead8466e9b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.450 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[16fbd837-16e9-490f-a310-62a1bf6c3ebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.451 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-500a5e98-86da-4709-b96b-eead8466e9b7
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/500a5e98-86da-4709-b96b-eead8466e9b7.pid.haproxy
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 500a5e98-86da-4709-b96b-eead8466e9b7
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:24:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:14.453 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'env', 'PROCESS_TAG=haproxy-500a5e98-86da-4709-b96b-eead8466e9b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/500a5e98-86da-4709-b96b-eead8466e9b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.459 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.647 182627 DEBUG nova.compute.manager [req-2ab3b504-6076-4ad4-a52d-b117773e7513 req-c81a4c59-ac8b-454c-a142-0a7ecc5d22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.654 182627 DEBUG oslo_concurrency.lockutils [req-2ab3b504-6076-4ad4-a52d-b117773e7513 req-c81a4c59-ac8b-454c-a142-0a7ecc5d22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.655 182627 DEBUG oslo_concurrency.lockutils [req-2ab3b504-6076-4ad4-a52d-b117773e7513 req-c81a4c59-ac8b-454c-a142-0a7ecc5d22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.655 182627 DEBUG oslo_concurrency.lockutils [req-2ab3b504-6076-4ad4-a52d-b117773e7513 req-c81a4c59-ac8b-454c-a142-0a7ecc5d22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.655 182627 DEBUG nova.compute.manager [req-2ab3b504-6076-4ad4-a52d-b117773e7513 req-c81a4c59-ac8b-454c-a142-0a7ecc5d22e0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Processing event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:24:14 np0005592767 podman[218352]: 2026-01-22 22:24:14.801683236 +0000 UTC m=+0.060187294 container create e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:24:14 np0005592767 systemd[1]: Started libpod-conmon-e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5.scope.
Jan 22 17:24:14 np0005592767 podman[218352]: 2026-01-22 22:24:14.763081193 +0000 UTC m=+0.021585291 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:24:14 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:24:14 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abd6d8819917eb304fb27d03e7a330ea59196d8d86bce626e0b6fdb12da29565/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:24:14 np0005592767 nova_compute[182623]: 2026-01-22 22:24:14.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:14 np0005592767 podman[218352]: 2026-01-22 22:24:14.941377729 +0000 UTC m=+0.199881807 container init e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:24:14 np0005592767 podman[218352]: 2026-01-22 22:24:14.951784489 +0000 UTC m=+0.210288547 container start e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:24:14 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [NOTICE]   (218372) : New worker (218374) forked
Jan 22 17:24:14 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [NOTICE]   (218372) : Loading success.
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.010 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.011 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.011 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.011 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.085 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.086 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120655.0848715, a8a23884-c76d-4690-a418-d67ad5bd459c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.086 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] VM Started (Lifecycle Event)#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.090 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.093 182627 INFO nova.virt.libvirt.driver [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Instance spawned successfully.#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.093 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.141 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.144 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.153 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.154 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.155 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.155 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.156 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.156 182627 DEBUG nova.virt.libvirt.driver [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.163 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.168 182627 DEBUG nova.network.neutron [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updated VIF entry in instance network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.168 182627 DEBUG nova.network.neutron [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.205 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.226 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.227 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120655.0865808, a8a23884-c76d-4690-a418-d67ad5bd459c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.227 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.229 182627 DEBUG oslo_concurrency.lockutils [req-526d0fea-dbb2-4607-bdfa-6f389a1f2417 req-2fe0aae9-0b7f-4f36-9968-07f85549ad68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.265 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.265 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.317 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.341 182627 INFO nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Took 6.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.342 182627 DEBUG nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.344 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.356 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120655.0884242, a8a23884-c76d-4690-a418-d67ad5bd459c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.357 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.386 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.398 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.453 182627 INFO nova.compute.manager [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Took 7.39 seconds to build instance.#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.474 182627 DEBUG oslo_concurrency.lockutils [None req-ad09fa6f-66d6-4553-9e74-c244e6b977f5 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.550 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.550 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5624MB free_disk=73.23500442504883GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.550 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.551 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.840 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance a8a23884-c76d-4690-a418-d67ad5bd459c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.841 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:24:15 np0005592767 nova_compute[182623]: 2026-01-22 22:24:15.841 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.077 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.311 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.339 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.339 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.340 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.340 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.356 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.760 182627 DEBUG nova.compute.manager [req-ec8b47c9-42bb-4dd7-8737-0c98aebf0e14 req-7303c541-d625-443e-bfb7-f8694b70ddb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.763 182627 DEBUG oslo_concurrency.lockutils [req-ec8b47c9-42bb-4dd7-8737-0c98aebf0e14 req-7303c541-d625-443e-bfb7-f8694b70ddb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.763 182627 DEBUG oslo_concurrency.lockutils [req-ec8b47c9-42bb-4dd7-8737-0c98aebf0e14 req-7303c541-d625-443e-bfb7-f8694b70ddb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.764 182627 DEBUG oslo_concurrency.lockutils [req-ec8b47c9-42bb-4dd7-8737-0c98aebf0e14 req-7303c541-d625-443e-bfb7-f8694b70ddb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.764 182627 DEBUG nova.compute.manager [req-ec8b47c9-42bb-4dd7-8737-0c98aebf0e14 req-7303c541-d625-443e-bfb7-f8694b70ddb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:16 np0005592767 nova_compute[182623]: 2026-01-22 22:24:16.764 182627 WARNING nova.compute.manager [req-ec8b47c9-42bb-4dd7-8737-0c98aebf0e14 req-7303c541-d625-443e-bfb7-f8694b70ddb8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received unexpected event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:24:17 np0005592767 podman[218397]: 2026-01-22 22:24:17.16304038 +0000 UTC m=+0.074651916 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:24:17 np0005592767 nova_compute[182623]: 2026-01-22 22:24:17.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:24:18 np0005592767 nova_compute[182623]: 2026-01-22 22:24:18.405 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:19 np0005592767 nova_compute[182623]: 2026-01-22 22:24:19.063 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120644.0621204, 422f993e-c0c6-47af-889b-1a4f3a052bca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:24:19 np0005592767 nova_compute[182623]: 2026-01-22 22:24:19.064 182627 INFO nova.compute.manager [-] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:24:19 np0005592767 nova_compute[182623]: 2026-01-22 22:24:19.091 182627 DEBUG nova.compute.manager [None req-783f39e3-b8bc-40eb-a361-e492139d8f44 - - - - - -] [instance: 422f993e-c0c6-47af-889b-1a4f3a052bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:24:20 np0005592767 nova_compute[182623]: 2026-01-22 22:24:20.141 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:22 np0005592767 nova_compute[182623]: 2026-01-22 22:24:22.136 182627 DEBUG nova.compute.manager [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:22 np0005592767 nova_compute[182623]: 2026-01-22 22:24:22.137 182627 DEBUG nova.compute.manager [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing instance network info cache due to event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:24:22 np0005592767 nova_compute[182623]: 2026-01-22 22:24:22.137 182627 DEBUG oslo_concurrency.lockutils [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:22 np0005592767 nova_compute[182623]: 2026-01-22 22:24:22.137 182627 DEBUG oslo_concurrency.lockutils [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:22 np0005592767 nova_compute[182623]: 2026-01-22 22:24:22.137 182627 DEBUG nova.network.neutron [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:24:23 np0005592767 nova_compute[182623]: 2026-01-22 22:24:23.408 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.262 182627 DEBUG nova.network.neutron [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updated VIF entry in instance network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.263 182627 DEBUG nova.network.neutron [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.321 182627 DEBUG oslo_concurrency.lockutils [req-aa4e3170-fafd-4cf7-bee1-74e82df0d701 req-e9dffb56-c6ca-4c74-880e-3a20db60b076 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.325 182627 DEBUG nova.compute.manager [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.326 182627 DEBUG nova.compute.manager [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing instance network info cache due to event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.326 182627 DEBUG oslo_concurrency.lockutils [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.326 182627 DEBUG oslo_concurrency.lockutils [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:24 np0005592767 nova_compute[182623]: 2026-01-22 22:24:24.327 182627 DEBUG nova.network.neutron [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:24:25 np0005592767 nova_compute[182623]: 2026-01-22 22:24:25.143 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:25 np0005592767 nova_compute[182623]: 2026-01-22 22:24:25.760 182627 DEBUG nova.network.neutron [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updated VIF entry in instance network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:24:25 np0005592767 nova_compute[182623]: 2026-01-22 22:24:25.761 182627 DEBUG nova.network.neutron [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:25 np0005592767 nova_compute[182623]: 2026-01-22 22:24:25.799 182627 DEBUG oslo_concurrency.lockutils [req-736e58d7-a551-4994-82b8-c54db7358af4 req-0e291313-54e8-490c-94a0-5f2a35369e3b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:26 np0005592767 nova_compute[182623]: 2026-01-22 22:24:26.353 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:27 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:27Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:37:a6 10.100.0.3
Jan 22 17:24:27 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:27Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:37:a6 10.100.0.3
Jan 22 17:24:28 np0005592767 nova_compute[182623]: 2026-01-22 22:24:28.411 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:29 np0005592767 podman[218434]: 2026-01-22 22:24:29.21573865 +0000 UTC m=+0.112934951 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:24:30 np0005592767 nova_compute[182623]: 2026-01-22 22:24:30.145 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:33 np0005592767 nova_compute[182623]: 2026-01-22 22:24:33.273 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:33 np0005592767 nova_compute[182623]: 2026-01-22 22:24:33.413 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:34 np0005592767 podman[218456]: 2026-01-22 22:24:34.165083945 +0000 UTC m=+0.068033432 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Jan 22 17:24:34 np0005592767 podman[218455]: 2026-01-22 22:24:34.180897925 +0000 UTC m=+0.083851422 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 22 17:24:35 np0005592767 nova_compute[182623]: 2026-01-22 22:24:35.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:38 np0005592767 nova_compute[182623]: 2026-01-22 22:24:38.416 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:40 np0005592767 nova_compute[182623]: 2026-01-22 22:24:40.158 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:41Z|00184|binding|INFO|Releasing lport f0c70c85-5528-4622-8533-7199b27961a2 from this chassis (sb_readonly=0)
Jan 22 17:24:41 np0005592767 nova_compute[182623]: 2026-01-22 22:24:41.128 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:41 np0005592767 podman[218502]: 2026-01-22 22:24:41.13662502 +0000 UTC m=+0.055326118 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:24:41 np0005592767 podman[218503]: 2026-01-22 22:24:41.175154521 +0000 UTC m=+0.086000791 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.421 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.757 182627 DEBUG oslo_concurrency.lockutils [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "interface-a8a23884-c76d-4690-a418-d67ad5bd459c-cac43648-6264-48e4-bc3b-218b5df95632" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.757 182627 DEBUG oslo_concurrency.lockutils [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "interface-a8a23884-c76d-4690-a418-d67ad5bd459c-cac43648-6264-48e4-bc3b-218b5df95632" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.758 182627 DEBUG nova.objects.instance [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lazy-loading 'flavor' on Instance uuid a8a23884-c76d-4690-a418-d67ad5bd459c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.858 182627 DEBUG nova.compute.manager [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.859 182627 DEBUG nova.compute.manager [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing instance network info cache due to event network-changed-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.859 182627 DEBUG oslo_concurrency.lockutils [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.859 182627 DEBUG oslo_concurrency.lockutils [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:43 np0005592767 nova_compute[182623]: 2026-01-22 22:24:43.859 182627 DEBUG nova.network.neutron [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:24:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:44.341 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:24:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:44.343 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:24:44 np0005592767 nova_compute[182623]: 2026-01-22 22:24:44.346 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:44 np0005592767 nova_compute[182623]: 2026-01-22 22:24:44.787 182627 DEBUG nova.objects.instance [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lazy-loading 'pci_requests' on Instance uuid a8a23884-c76d-4690-a418-d67ad5bd459c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:44 np0005592767 nova_compute[182623]: 2026-01-22 22:24:44.854 182627 DEBUG nova.network.neutron [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:24:45 np0005592767 nova_compute[182623]: 2026-01-22 22:24:45.161 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:45 np0005592767 nova_compute[182623]: 2026-01-22 22:24:45.319 182627 DEBUG nova.policy [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee60032f6e6844d2b5f32dec17c83e5b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d4efed580924587923a2cc36dca176f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:24:45 np0005592767 nova_compute[182623]: 2026-01-22 22:24:45.793 182627 DEBUG nova.network.neutron [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updated VIF entry in instance network info cache for port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:24:45 np0005592767 nova_compute[182623]: 2026-01-22 22:24:45.795 182627 DEBUG nova.network.neutron [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:45 np0005592767 nova_compute[182623]: 2026-01-22 22:24:45.816 182627 DEBUG oslo_concurrency.lockutils [req-45266c17-de85-4c06-91ca-ffaf85148da0 req-20db6fd2-f34f-484b-95a2-2da6dd929b98 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.780 182627 DEBUG nova.network.neutron [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Successfully updated port: cac43648-6264-48e4-bc3b-218b5df95632 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.794 182627 DEBUG oslo_concurrency.lockutils [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.795 182627 DEBUG oslo_concurrency.lockutils [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.795 182627 DEBUG nova.network.neutron [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.880 182627 DEBUG nova.compute.manager [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-changed-cac43648-6264-48e4-bc3b-218b5df95632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.881 182627 DEBUG nova.compute.manager [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing instance network info cache due to event network-changed-cac43648-6264-48e4-bc3b-218b5df95632. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.881 182627 DEBUG oslo_concurrency.lockutils [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:46 np0005592767 nova_compute[182623]: 2026-01-22 22:24:46.991 182627 WARNING nova.network.neutron [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] 500a5e98-86da-4709-b96b-eead8466e9b7 already exists in list: networks containing: ['500a5e98-86da-4709-b96b-eead8466e9b7']. ignoring it#033[00m
Jan 22 17:24:48 np0005592767 podman[218543]: 2026-01-22 22:24:48.120462598 +0000 UTC m=+0.044232770 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:24:48 np0005592767 nova_compute[182623]: 2026-01-22 22:24:48.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:49.346 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.082 182627 DEBUG nova.network.neutron [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.109 182627 DEBUG oslo_concurrency.lockutils [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.110 182627 DEBUG oslo_concurrency.lockutils [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.110 182627 DEBUG nova.network.neutron [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Refreshing network info cache for port cac43648-6264-48e4-bc3b-218b5df95632 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.114 182627 DEBUG nova.virt.libvirt.vif [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.114 182627 DEBUG nova.network.os_vif_util [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.116 182627 DEBUG nova.network.os_vif_util [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.116 182627 DEBUG os_vif [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.117 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.117 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.118 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.121 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.122 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcac43648-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.122 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcac43648-62, col_values=(('external_ids', {'iface-id': 'cac43648-6264-48e4-bc3b-218b5df95632', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:29:29', 'vm-uuid': 'a8a23884-c76d-4690-a418-d67ad5bd459c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.161 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 NetworkManager[54973]: <info>  [1769120690.1622] manager: (tapcac43648-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.172 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.174 182627 INFO os_vif [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62')#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.175 182627 DEBUG nova.virt.libvirt.vif [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.176 182627 DEBUG nova.network.os_vif_util [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.177 182627 DEBUG nova.network.os_vif_util [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.179 182627 DEBUG nova.virt.libvirt.guest [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] attach device xml: <interface type="ethernet">
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:5d:29:29"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <target dev="tapcac43648-62"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:24:50 np0005592767 nova_compute[182623]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 22 17:24:50 np0005592767 kernel: tapcac43648-62: entered promiscuous mode
Jan 22 17:24:50 np0005592767 NetworkManager[54973]: <info>  [1769120690.1960] manager: (tapcac43648-62): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Jan 22 17:24:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:50Z|00185|binding|INFO|Claiming lport cac43648-6264-48e4-bc3b-218b5df95632 for this chassis.
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.195 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:50Z|00186|binding|INFO|cac43648-6264-48e4-bc3b-218b5df95632: Claiming fa:16:3e:5d:29:29 10.100.0.4
Jan 22 17:24:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:50Z|00187|binding|INFO|Setting lport cac43648-6264-48e4-bc3b-218b5df95632 ovn-installed in OVS
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.216 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.221 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:50Z|00188|binding|INFO|Setting lport cac43648-6264-48e4-bc3b-218b5df95632 up in Southbound
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.223 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:29:29 10.100.0.4'], port_security=['fa:16:3e:5d:29:29 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1842659854', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a8a23884-c76d-4690-a418-d67ad5bd459c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-500a5e98-86da-4709-b96b-eead8466e9b7', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1842659854', 'neutron:project_id': '4d4efed580924587923a2cc36dca176f', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'ddc68bcb-a294-40eb-ac23-96c3e9f4dbd8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93e1ed03-3d53-4a6c-95c4-ab8774c4d51c, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=cac43648-6264-48e4-bc3b-218b5df95632) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.224 104135 INFO neutron.agent.ovn.metadata.agent [-] Port cac43648-6264-48e4-bc3b-218b5df95632 in datapath 500a5e98-86da-4709-b96b-eead8466e9b7 bound to our chassis#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.226 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 500a5e98-86da-4709-b96b-eead8466e9b7#033[00m
Jan 22 17:24:50 np0005592767 systemd-udevd[218574]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.241 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[14d09f81-e979-42a1-8792-0c3a311c095d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:50 np0005592767 NetworkManager[54973]: <info>  [1769120690.2563] device (tapcac43648-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:24:50 np0005592767 NetworkManager[54973]: <info>  [1769120690.2571] device (tapcac43648-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.278 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4f303b-5015-4875-b1c2-8dadda7711a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.281 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7497eaa5-3e85-41d5-bc14-97de00d5df22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.291 182627 DEBUG nova.virt.libvirt.driver [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.292 182627 DEBUG nova.virt.libvirt.driver [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.292 182627 DEBUG nova.virt.libvirt.driver [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No VIF found with MAC fa:16:3e:87:37:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.292 182627 DEBUG nova.virt.libvirt.driver [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] No VIF found with MAC fa:16:3e:5d:29:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.306 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[67b5535c-0155-4501-af32-dc6764d29e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.321 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[af9cb19c-f1bd-44bb-b8e3-0a02c1edcc42]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap500a5e98-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:85:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425088, 'reachable_time': 25429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218581, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.324 182627 DEBUG nova.virt.libvirt.guest [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:name>tempest-tempest.common.compute-instance-1726954682</nova:name>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:24:50</nova:creationTime>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:user uuid="ee60032f6e6844d2b5f32dec17c83e5b">tempest-AttachInterfacesTestJSON-1827119457-project-member</nova:user>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:project uuid="4d4efed580924587923a2cc36dca176f">tempest-AttachInterfacesTestJSON-1827119457</nova:project>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:port uuid="3162b3ad-4f6d-4a7d-9f3b-a752bedd5395">
Jan 22 17:24:50 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    <nova:port uuid="cac43648-6264-48e4-bc3b-218b5df95632">
Jan 22 17:24:50 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:50 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:24:50 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:24:50 np0005592767 nova_compute[182623]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.334 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3f30f377-5909-4d37-ab09-5f2e68ced197]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap500a5e98-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425098, 'tstamp': 425098}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218582, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap500a5e98-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425101, 'tstamp': 425101}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218582, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.338 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap500a5e98-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.340 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.341 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.341 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap500a5e98-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.342 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.342 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap500a5e98-80, col_values=(('external_ids', {'iface-id': 'f0c70c85-5528-4622-8533-7199b27961a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:50.343 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:50 np0005592767 nova_compute[182623]: 2026-01-22 22:24:50.566 182627 DEBUG oslo_concurrency.lockutils [None req-fa176119-81fb-4a35-a101-e369c596a3a1 ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "interface-a8a23884-c76d-4690-a418-d67ad5bd459c-cac43648-6264-48e4-bc3b-218b5df95632" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.491 182627 DEBUG nova.compute.manager [req-944112f4-82c4-42e6-a0a0-f17dc802e828 req-00892514-2e39-470f-98a2-7291d18c9d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.493 182627 DEBUG oslo_concurrency.lockutils [req-944112f4-82c4-42e6-a0a0-f17dc802e828 req-00892514-2e39-470f-98a2-7291d18c9d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.493 182627 DEBUG oslo_concurrency.lockutils [req-944112f4-82c4-42e6-a0a0-f17dc802e828 req-00892514-2e39-470f-98a2-7291d18c9d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.494 182627 DEBUG oslo_concurrency.lockutils [req-944112f4-82c4-42e6-a0a0-f17dc802e828 req-00892514-2e39-470f-98a2-7291d18c9d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.494 182627 DEBUG nova.compute.manager [req-944112f4-82c4-42e6-a0a0-f17dc802e828 req-00892514-2e39-470f-98a2-7291d18c9d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.494 182627 WARNING nova.compute.manager [req-944112f4-82c4-42e6-a0a0-f17dc802e828 req-00892514-2e39-470f-98a2-7291d18c9d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received unexpected event network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.840 182627 DEBUG oslo_concurrency.lockutils [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "interface-a8a23884-c76d-4690-a418-d67ad5bd459c-cac43648-6264-48e4-bc3b-218b5df95632" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:51 np0005592767 nova_compute[182623]: 2026-01-22 22:24:51.841 182627 DEBUG oslo_concurrency.lockutils [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "interface-a8a23884-c76d-4690-a418-d67ad5bd459c-cac43648-6264-48e4-bc3b-218b5df95632" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.024 182627 DEBUG nova.objects.instance [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lazy-loading 'flavor' on Instance uuid a8a23884-c76d-4690-a418-d67ad5bd459c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.132 182627 DEBUG nova.virt.libvirt.vif [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.132 182627 DEBUG nova.network.os_vif_util [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.133 182627 DEBUG nova.network.os_vif_util [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.136 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5d:29:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcac43648-62"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.137 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5d:29:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcac43648-62"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.139 182627 DEBUG nova.virt.libvirt.driver [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Attempting to detach device tapcac43648-62 from instance a8a23884-c76d-4690-a418-d67ad5bd459c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.139 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:5d:29:29"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <target dev="tapcac43648-62"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.160 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5d:29:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcac43648-62"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.163 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5d:29:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcac43648-62"/></interface>not found in domain: <domain type='kvm' id='27'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <name>instance-00000033</name>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <uuid>a8a23884-c76d-4690-a418-d67ad5bd459c</uuid>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:name>tempest-tempest.common.compute-instance-1726954682</nova:name>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:24:50</nova:creationTime>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:user uuid="ee60032f6e6844d2b5f32dec17c83e5b">tempest-AttachInterfacesTestJSON-1827119457-project-member</nova:user>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:project uuid="4d4efed580924587923a2cc36dca176f">tempest-AttachInterfacesTestJSON-1827119457</nova:project>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:port uuid="3162b3ad-4f6d-4a7d-9f3b-a752bedd5395">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:port uuid="cac43648-6264-48e4-bc3b-218b5df95632">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <memory unit='KiB'>131072</memory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <vcpu placement='static'>1</vcpu>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <resource>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <partition>/machine</partition>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </resource>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <sysinfo type='smbios'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='manufacturer'>RDO</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='serial'>a8a23884-c76d-4690-a418-d67ad5bd459c</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='uuid'>a8a23884-c76d-4690-a418-d67ad5bd459c</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='family'>Virtual Machine</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <boot dev='hd'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <smbios mode='sysinfo'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <vmcoreinfo state='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <model fallback='forbid'>Nehalem</model>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <feature policy='require' name='x2apic'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <feature policy='require' name='hypervisor'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <feature policy='require' name='vme'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <clock offset='utc'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <timer name='hpet' present='no'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <on_poweroff>destroy</on_poweroff>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <on_reboot>restart</on_reboot>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <on_crash>destroy</on_crash>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <disk type='file' device='disk'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk' index='2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <backingStore type='file' index='3'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <format type='raw'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <backingStore/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      </backingStore>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='vda' bus='virtio'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='virtio-disk0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <disk type='file' device='cdrom'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.config' index='1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <backingStore/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='sda' bus='sata'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <readonly/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='sata0-0-0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pcie.0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='1' port='0x10'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='2' port='0x11'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='3' port='0x12'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='4' port='0x13'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='5' port='0x14'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='6' port='0x15'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='7' port='0x16'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='8' port='0x17'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.8'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='9' port='0x18'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.9'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='10' port='0x19'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.10'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='11' port='0x1a'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.11'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='12' port='0x1b'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.12'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='13' port='0x1c'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.13'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='14' port='0x1d'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.14'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='15' port='0x1e'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.15'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='16' port='0x1f'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.16'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='17' port='0x20'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.17'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='18' port='0x21'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.18'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='19' port='0x22'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.19'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='20' port='0x23'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.20'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='21' port='0x24'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.21'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='22' port='0x25'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.22'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='23' port='0x26'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.23'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='24' port='0x27'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.24'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='25' port='0x28'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.25'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-pci-bridge'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.26'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='usb'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='sata' index='0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='ide'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <interface type='ethernet'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <mac address='fa:16:3e:87:37:a6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='tap3162b3ad-4f'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model type='virtio'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <mtu size='1442'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='net0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <interface type='ethernet'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <mac address='fa:16:3e:5d:29:29'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='tapcac43648-62'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model type='virtio'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <mtu size='1442'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='net1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <serial type='pty'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/console.log' append='off'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target type='isa-serial' port='0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <model name='isa-serial'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      </target>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/console.log' append='off'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target type='serial' port='0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <input type='tablet' bus='usb'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='input0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='usb' bus='0' port='1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <input type='mouse' bus='ps2'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='input1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <input type='keyboard' bus='ps2'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='input2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <listen type='address' address='::0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <audio id='1' type='none'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='video0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <watchdog model='itco' action='reset'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='watchdog0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </watchdog>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <memballoon model='virtio'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <stats period='10'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='balloon0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <rng model='virtio'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <backend model='random'>/dev/urandom</backend>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='rng0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <label>system_u:system_r:svirt_t:s0:c380,c846</label>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c380,c846</imagelabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <label>+107:+107</label>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <imagelabel>+107:+107</imagelabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.166 182627 INFO nova.virt.libvirt.driver [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully detached device tapcac43648-62 from instance a8a23884-c76d-4690-a418-d67ad5bd459c from the persistent domain config.
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.166 182627 DEBUG nova.virt.libvirt.driver [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] (1/8): Attempting to detach device tapcac43648-62 with device alias net1 from instance a8a23884-c76d-4690-a418-d67ad5bd459c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.167 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:5d:29:29"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <target dev="tapcac43648-62"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 22 17:24:52 np0005592767 kernel: tapcac43648-62 (unregistering): left promiscuous mode
Jan 22 17:24:52 np0005592767 NetworkManager[54973]: <info>  [1769120692.2786] device (tapcac43648-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.281 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:52Z|00189|binding|INFO|Releasing lport cac43648-6264-48e4-bc3b-218b5df95632 from this chassis (sb_readonly=0)
Jan 22 17:24:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:52Z|00190|binding|INFO|Setting lport cac43648-6264-48e4-bc3b-218b5df95632 down in Southbound
Jan 22 17:24:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:52Z|00191|binding|INFO|Removing iface tapcac43648-62 ovn-installed in OVS
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.284 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.288 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:29:29 10.100.0.4'], port_security=['fa:16:3e:5d:29:29 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1842659854', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a8a23884-c76d-4690-a418-d67ad5bd459c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-500a5e98-86da-4709-b96b-eead8466e9b7', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1842659854', 'neutron:project_id': '4d4efed580924587923a2cc36dca176f', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'ddc68bcb-a294-40eb-ac23-96c3e9f4dbd8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93e1ed03-3d53-4a6c-95c4-ab8774c4d51c, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=cac43648-6264-48e4-bc3b-218b5df95632) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.289 104135 INFO neutron.agent.ovn.metadata.agent [-] Port cac43648-6264-48e4-bc3b-218b5df95632 in datapath 500a5e98-86da-4709-b96b-eead8466e9b7 unbound from our chassis
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.291 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 500a5e98-86da-4709-b96b-eead8466e9b7
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.298 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.305 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[aa53702a-1ff9-4477-abdc-8dbc871fe7f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.315 182627 DEBUG nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Received event <DeviceRemovedEvent: 1769120692.3147755, a8a23884-c76d-4690-a418-d67ad5bd459c => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.316 182627 DEBUG nova.virt.libvirt.driver [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Start waiting for the detach event from libvirt for device tapcac43648-62 with device alias net1 for instance a8a23884-c76d-4690-a418-d67ad5bd459c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.317 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5d:29:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcac43648-62"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.320 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5d:29:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapcac43648-62"/></interface>not found in domain: <domain type='kvm' id='27'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <name>instance-00000033</name>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <uuid>a8a23884-c76d-4690-a418-d67ad5bd459c</uuid>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:name>tempest-tempest.common.compute-instance-1726954682</nova:name>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:24:50</nova:creationTime>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:user uuid="ee60032f6e6844d2b5f32dec17c83e5b">tempest-AttachInterfacesTestJSON-1827119457-project-member</nova:user>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:project uuid="4d4efed580924587923a2cc36dca176f">tempest-AttachInterfacesTestJSON-1827119457</nova:project>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:port uuid="3162b3ad-4f6d-4a7d-9f3b-a752bedd5395">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:port uuid="cac43648-6264-48e4-bc3b-218b5df95632">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <memory unit='KiB'>131072</memory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <vcpu placement='static'>1</vcpu>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <resource>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <partition>/machine</partition>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </resource>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <sysinfo type='smbios'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='manufacturer'>RDO</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='serial'>a8a23884-c76d-4690-a418-d67ad5bd459c</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='uuid'>a8a23884-c76d-4690-a418-d67ad5bd459c</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <entry name='family'>Virtual Machine</entry>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <boot dev='hd'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <smbios mode='sysinfo'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <vmcoreinfo state='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <model fallback='forbid'>Nehalem</model>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <feature policy='require' name='x2apic'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <feature policy='require' name='hypervisor'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <feature policy='require' name='vme'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <clock offset='utc'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <timer name='hpet' present='no'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <on_poweroff>destroy</on_poweroff>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <on_reboot>restart</on_reboot>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <on_crash>destroy</on_crash>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <disk type='file' device='disk'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk' index='2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <backingStore type='file' index='3'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <format type='raw'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <backingStore/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      </backingStore>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='vda' bus='virtio'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='virtio-disk0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <disk type='file' device='cdrom'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/disk.config' index='1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <backingStore/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='sda' bus='sata'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <readonly/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='sata0-0-0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pcie.0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='1' port='0x10'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='2' port='0x11'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='3' port='0x12'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='4' port='0x13'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='5' port='0x14'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='6' port='0x15'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='7' port='0x16'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='8' port='0x17'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.8'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='9' port='0x18'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.9'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='10' port='0x19'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.10'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='11' port='0x1a'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.11'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='12' port='0x1b'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.12'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='13' port='0x1c'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.13'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='14' port='0x1d'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.14'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='15' port='0x1e'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.15'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='16' port='0x1f'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.16'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='17' port='0x20'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.17'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='18' port='0x21'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.18'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='19' port='0x22'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.19'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='20' port='0x23'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.20'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='21' port='0x24'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.21'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='22' port='0x25'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.22'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='23' port='0x26'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.23'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='24' port='0x27'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.24'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target chassis='25' port='0x28'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.25'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model name='pcie-pci-bridge'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='pci.26'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='usb'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <controller type='sata' index='0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='ide'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <interface type='ethernet'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <mac address='fa:16:3e:87:37:a6'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target dev='tap3162b3ad-4f'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model type='virtio'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <mtu size='1442'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='net0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <serial type='pty'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/console.log' append='off'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target type='isa-serial' port='0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:        <model name='isa-serial'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      </target>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c/console.log' append='off'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <target type='serial' port='0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <input type='tablet' bus='usb'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='input0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='usb' bus='0' port='1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <input type='mouse' bus='ps2'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='input1'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <input type='keyboard' bus='ps2'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='input2'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <listen type='address' address='::0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <audio id='1' type='none'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='video0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <watchdog model='itco' action='reset'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='watchdog0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </watchdog>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <memballoon model='virtio'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <stats period='10'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='balloon0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <rng model='virtio'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <backend model='random'>/dev/urandom</backend>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <alias name='rng0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <label>system_u:system_r:svirt_t:s0:c380,c846</label>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c380,c846</imagelabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <label>+107:+107</label>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <imagelabel>+107:+107</imagelabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.320 182627 INFO nova.virt.libvirt.driver [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully detached device tapcac43648-62 from instance a8a23884-c76d-4690-a418-d67ad5bd459c from the live domain config.#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.321 182627 DEBUG nova.virt.libvirt.vif [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.321 182627 DEBUG nova.network.os_vif_util [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.322 182627 DEBUG nova.network.os_vif_util [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.323 182627 DEBUG os_vif [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.324 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.324 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcac43648-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.326 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.328 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.329 182627 INFO os_vif [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62')#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.330 182627 DEBUG nova.virt.libvirt.guest [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:name>tempest-tempest.common.compute-instance-1726954682</nova:name>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:24:52</nova:creationTime>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:user uuid="ee60032f6e6844d2b5f32dec17c83e5b">tempest-AttachInterfacesTestJSON-1827119457-project-member</nova:user>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:project uuid="4d4efed580924587923a2cc36dca176f">tempest-AttachInterfacesTestJSON-1827119457</nova:project>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    <nova:port uuid="3162b3ad-4f6d-4a7d-9f3b-a752bedd5395">
Jan 22 17:24:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:24:52 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:24:52 np0005592767 nova_compute[182623]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.333 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e43934-f6d5-4701-a72e-7d868d4c77af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.336 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7c200757-6b53-495c-8eb6-1b66f87af64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.359 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[aef29b8d-f81d-44dc-9248-6a66b78d2ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.378 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[64633751-f17f-42e2-ad27-e171fb69b5d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap500a5e98-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:85:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425088, 'reachable_time': 25429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218593, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.395 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[058bfffe-acf3-443b-805d-bbd946891ffc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap500a5e98-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425098, 'tstamp': 425098}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218594, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap500a5e98-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425101, 'tstamp': 425101}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218594, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.397 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap500a5e98-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.444 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.446 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap500a5e98-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.447 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.447 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap500a5e98-80, col_values=(('external_ids', {'iface-id': 'f0c70c85-5528-4622-8533-7199b27961a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:52.447 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.842 182627 DEBUG nova.network.neutron [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updated VIF entry in instance network info cache for port cac43648-6264-48e4-bc3b-218b5df95632. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.842 182627 DEBUG nova.network.neutron [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:52 np0005592767 nova_compute[182623]: 2026-01-22 22:24:52.879 182627 DEBUG oslo_concurrency.lockutils [req-091bbe62-6012-4e8b-bdb0-765b22cacefb req-8fd143d7-830c-491d-a714-9a7d06a4520b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.616 182627 DEBUG nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.616 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.616 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.617 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.617 182627 DEBUG nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.617 182627 WARNING nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received unexpected event network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.617 182627 DEBUG nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-unplugged-cac43648-6264-48e4-bc3b-218b5df95632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.617 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.617 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 DEBUG nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-unplugged-cac43648-6264-48e4-bc3b-218b5df95632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 WARNING nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received unexpected event network-vif-unplugged-cac43648-6264-48e4-bc3b-218b5df95632 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 DEBUG nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.618 182627 DEBUG oslo_concurrency.lockutils [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.619 182627 DEBUG nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:53 np0005592767 nova_compute[182623]: 2026-01-22 22:24:53.619 182627 WARNING nova.compute.manager [req-a063b932-ef8e-4f1d-98b6-920d1667979c req-06bcff89-3efb-47da-953a-7bc501e32712 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received unexpected event network-vif-plugged-cac43648-6264-48e4-bc3b-218b5df95632 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.143 182627 DEBUG oslo_concurrency.lockutils [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.143 182627 DEBUG oslo_concurrency.lockutils [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquired lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.144 182627 DEBUG nova.network.neutron [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.176 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.345 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.345 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.345 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.346 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.346 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.357 182627 INFO nova.compute.manager [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Terminating instance#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.368 182627 DEBUG nova.compute.manager [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:24:55 np0005592767 kernel: tap3162b3ad-4f (unregistering): left promiscuous mode
Jan 22 17:24:55 np0005592767 NetworkManager[54973]: <info>  [1769120695.3984] device (tap3162b3ad-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:24:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:55Z|00192|binding|INFO|Releasing lport 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 from this chassis (sb_readonly=0)
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.408 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:55Z|00193|binding|INFO|Setting lport 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 down in Southbound
Jan 22 17:24:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:24:55Z|00194|binding|INFO|Removing iface tap3162b3ad-4f ovn-installed in OVS
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.410 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.426 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.434 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:37:a6 10.100.0.3'], port_security=['fa:16:3e:87:37:a6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'a8a23884-c76d-4690-a418-d67ad5bd459c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-500a5e98-86da-4709-b96b-eead8466e9b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d4efed580924587923a2cc36dca176f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '928ee6fb-b498-4b31-9898-b1b27063fabb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93e1ed03-3d53-4a6c-95c4-ab8774c4d51c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.435 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 in datapath 500a5e98-86da-4709-b96b-eead8466e9b7 unbound from our chassis#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.437 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 500a5e98-86da-4709-b96b-eead8466e9b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.439 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f6df0f88-3b61-463d-ac32-59cf3f5e42d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.439 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7 namespace which is not needed anymore#033[00m
Jan 22 17:24:55 np0005592767 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 22 17:24:55 np0005592767 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000033.scope: Consumed 13.755s CPU time.
Jan 22 17:24:55 np0005592767 systemd-machined[153912]: Machine qemu-27-instance-00000033 terminated.
Jan 22 17:24:55 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [NOTICE]   (218372) : haproxy version is 2.8.14-c23fe91
Jan 22 17:24:55 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [NOTICE]   (218372) : path to executable is /usr/sbin/haproxy
Jan 22 17:24:55 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [WARNING]  (218372) : Exiting Master process...
Jan 22 17:24:55 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [ALERT]    (218372) : Current worker (218374) exited with code 143 (Terminated)
Jan 22 17:24:55 np0005592767 neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7[218368]: [WARNING]  (218372) : All workers exited. Exiting... (0)
Jan 22 17:24:55 np0005592767 systemd[1]: libpod-e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5.scope: Deactivated successfully.
Jan 22 17:24:55 np0005592767 podman[218620]: 2026-01-22 22:24:55.615289958 +0000 UTC m=+0.049318502 container died e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.625 182627 INFO nova.virt.libvirt.driver [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Instance destroyed successfully.#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.626 182627 DEBUG nova.objects.instance [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lazy-loading 'resources' on Instance uuid a8a23884-c76d-4690-a418-d67ad5bd459c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.642 182627 DEBUG nova.virt.libvirt.vif [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.642 182627 DEBUG nova.network.os_vif_util [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.643 182627 DEBUG nova.network.os_vif_util [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.643 182627 DEBUG os_vif [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.644 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.644 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3162b3ad-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.646 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.647 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.649 182627 INFO os_vif [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:37:a6,bridge_name='br-int',has_traffic_filtering=True,id=3162b3ad-4f6d-4a7d-9f3b-a752bedd5395,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3162b3ad-4f')#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.651 182627 DEBUG nova.virt.libvirt.vif [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1726954682',display_name='tempest-tempest.common.compute-instance-1726954682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1726954682',id=51,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOpkMByWZKnazLsHU4zNqY5asJ46LAdtsmfj4QT55iHsSLAbScbrxThrxciHJPlXF8AcafHHksIc6RU7ydQ+Pgd5qzkMslHvE8pL5agVxjpfxfiiZMCTqfDPks6h50F2LA==',key_name='tempest-keypair-2013247231',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d4efed580924587923a2cc36dca176f',ramdisk_id='',reservation_id='r-ouoruzi3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1827119457',owner_user_name='tempest-AttachInterfacesTestJSON-1827119457-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:24:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee60032f6e6844d2b5f32dec17c83e5b',uuid=a8a23884-c76d-4690-a418-d67ad5bd459c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.652 182627 DEBUG nova.network.os_vif_util [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converting VIF {"id": "cac43648-6264-48e4-bc3b-218b5df95632", "address": "fa:16:3e:5d:29:29", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac43648-62", "ovs_interfaceid": "cac43648-6264-48e4-bc3b-218b5df95632", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.654 182627 DEBUG nova.network.os_vif_util [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.655 182627 DEBUG os_vif [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.658 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.658 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcac43648-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5-userdata-shm.mount: Deactivated successfully.
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.658 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.660 182627 INFO os_vif [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:29:29,bridge_name='br-int',has_traffic_filtering=True,id=cac43648-6264-48e4-bc3b-218b5df95632,network=Network(500a5e98-86da-4709-b96b-eead8466e9b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcac43648-62')#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.661 182627 INFO nova.virt.libvirt.driver [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Deleting instance files /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c_del#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.661 182627 INFO nova.virt.libvirt.driver [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Deletion of /var/lib/nova/instances/a8a23884-c76d-4690-a418-d67ad5bd459c_del complete#033[00m
Jan 22 17:24:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay-abd6d8819917eb304fb27d03e7a330ea59196d8d86bce626e0b6fdb12da29565-merged.mount: Deactivated successfully.
Jan 22 17:24:55 np0005592767 podman[218620]: 2026-01-22 22:24:55.674322169 +0000 UTC m=+0.108350713 container cleanup e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:24:55 np0005592767 systemd[1]: libpod-conmon-e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5.scope: Deactivated successfully.
Jan 22 17:24:55 np0005592767 podman[218666]: 2026-01-22 22:24:55.771283154 +0000 UTC m=+0.063171867 container remove e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.779 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d7693850-ada9-4401-ac9e-b518b03490fb]: (4, ('Thu Jan 22 10:24:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7 (e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5)\ne83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5\nThu Jan 22 10:24:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7 (e83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5)\ne83fb53cda3295170fbe1e6d10539209104900cfc6f3478cfc2d483d0c35b0d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.781 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[33111be5-2cfc-4ffa-b2aa-36c99f63cf25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.782 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap500a5e98-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:24:55 np0005592767 kernel: tap500a5e98-80: left promiscuous mode
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.785 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.792 182627 INFO nova.compute.manager [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.792 182627 DEBUG oslo.service.loopingcall [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.793 182627 DEBUG nova.compute.manager [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.793 182627 DEBUG nova.network.neutron [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.795 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 nova_compute[182623]: 2026-01-22 22:24:55.796 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.799 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75b1eb08-39bc-4de9-b57b-24a9c6d8c8bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.815 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[098938e2-a257-4735-9007-ce6831c799c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.817 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2c89d3bf-6567-4707-8370-4ccfaa4fd474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.832 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56e0626b-6000-4776-8f8d-f803124f6069]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425079, 'reachable_time': 29295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218681, 'error': None, 'target': 'ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.835 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-500a5e98-86da-4709-b96b-eead8466e9b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:24:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:24:55.835 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[390190e8-333d-47d2-b2bb-7cc0407012be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:24:55 np0005592767 systemd[1]: run-netns-ovnmeta\x2d500a5e98\x2d86da\x2d4709\x2db96b\x2deead8466e9b7.mount: Deactivated successfully.
Jan 22 17:24:56 np0005592767 nova_compute[182623]: 2026-01-22 22:24:56.459 182627 DEBUG nova.compute.manager [req-27b8d988-a072-4f5e-9af6-9da6eb64a49c req-d322ff2d-5303-4581-8c69-3ccbe49e6487 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-unplugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:24:56 np0005592767 nova_compute[182623]: 2026-01-22 22:24:56.460 182627 DEBUG oslo_concurrency.lockutils [req-27b8d988-a072-4f5e-9af6-9da6eb64a49c req-d322ff2d-5303-4581-8c69-3ccbe49e6487 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:24:56 np0005592767 nova_compute[182623]: 2026-01-22 22:24:56.461 182627 DEBUG oslo_concurrency.lockutils [req-27b8d988-a072-4f5e-9af6-9da6eb64a49c req-d322ff2d-5303-4581-8c69-3ccbe49e6487 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:24:56 np0005592767 nova_compute[182623]: 2026-01-22 22:24:56.461 182627 DEBUG oslo_concurrency.lockutils [req-27b8d988-a072-4f5e-9af6-9da6eb64a49c req-d322ff2d-5303-4581-8c69-3ccbe49e6487 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:24:56 np0005592767 nova_compute[182623]: 2026-01-22 22:24:56.461 182627 DEBUG nova.compute.manager [req-27b8d988-a072-4f5e-9af6-9da6eb64a49c req-d322ff2d-5303-4581-8c69-3ccbe49e6487 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-unplugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:24:56 np0005592767 nova_compute[182623]: 2026-01-22 22:24:56.461 182627 DEBUG nova.compute.manager [req-27b8d988-a072-4f5e-9af6-9da6eb64a49c req-d322ff2d-5303-4581-8c69-3ccbe49e6487 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-unplugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:24:57 np0005592767 nova_compute[182623]: 2026-01-22 22:24:57.389 182627 INFO nova.network.neutron [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Port cac43648-6264-48e4-bc3b-218b5df95632 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 22 17:24:57 np0005592767 nova_compute[182623]: 2026-01-22 22:24:57.390 182627 DEBUG nova.network.neutron [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [{"id": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "address": "fa:16:3e:87:37:a6", "network": {"id": "500a5e98-86da-4709-b96b-eead8466e9b7", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1258452122-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d4efed580924587923a2cc36dca176f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3162b3ad-4f", "ovs_interfaceid": "3162b3ad-4f6d-4a7d-9f3b-a752bedd5395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:24:57 np0005592767 nova_compute[182623]: 2026-01-22 22:24:57.448 182627 DEBUG oslo_concurrency.lockutils [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Releasing lock "refresh_cache-a8a23884-c76d-4690-a418-d67ad5bd459c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:24:57 np0005592767 nova_compute[182623]: 2026-01-22 22:24:57.610 182627 DEBUG oslo_concurrency.lockutils [None req-5c6b920e-b818-4ca4-9ddd-7508361f534c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "interface-a8a23884-c76d-4690-a418-d67ad5bd459c-cac43648-6264-48e4-bc3b-218b5df95632" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.042 182627 DEBUG nova.compute.manager [req-880f1eee-5e58-4582-b3d1-c83db53585ca req-09e42af1-3abd-4827-8197-8c58dd8f8b93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.043 182627 DEBUG oslo_concurrency.lockutils [req-880f1eee-5e58-4582-b3d1-c83db53585ca req-09e42af1-3abd-4827-8197-8c58dd8f8b93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.043 182627 DEBUG oslo_concurrency.lockutils [req-880f1eee-5e58-4582-b3d1-c83db53585ca req-09e42af1-3abd-4827-8197-8c58dd8f8b93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.044 182627 DEBUG oslo_concurrency.lockutils [req-880f1eee-5e58-4582-b3d1-c83db53585ca req-09e42af1-3abd-4827-8197-8c58dd8f8b93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.045 182627 DEBUG nova.compute.manager [req-880f1eee-5e58-4582-b3d1-c83db53585ca req-09e42af1-3abd-4827-8197-8c58dd8f8b93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] No waiting events found dispatching network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.045 182627 WARNING nova.compute.manager [req-880f1eee-5e58-4582-b3d1-c83db53585ca req-09e42af1-3abd-4827-8197-8c58dd8f8b93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received unexpected event network-vif-plugged-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.178 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.183 182627 DEBUG nova.network.neutron [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:25:00 np0005592767 podman[218682]: 2026-01-22 22:25:00.187588367 +0000 UTC m=+0.093917422 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.234 182627 INFO nova.compute.manager [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Took 4.44 seconds to deallocate network for instance.#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.344 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.344 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.394 182627 DEBUG nova.compute.manager [req-c57591a2-93bf-49a8-a24e-d3c1100a6806 req-88cad791-b256-44c2-a96a-1efbfb42f989 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Received event network-vif-deleted-3162b3ad-4f6d-4a7d-9f3b-a752bedd5395 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.472 182627 DEBUG nova.compute.provider_tree [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.505 182627 DEBUG nova.scheduler.client.report [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.536 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.584 182627 INFO nova.scheduler.client.report [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Deleted allocations for instance a8a23884-c76d-4690-a418-d67ad5bd459c#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.646 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:00 np0005592767 nova_compute[182623]: 2026-01-22 22:25:00.671 182627 DEBUG oslo_concurrency.lockutils [None req-97f7f8ca-78ab-4e09-9b3b-a52b0b4c6f6c ee60032f6e6844d2b5f32dec17c83e5b 4d4efed580924587923a2cc36dca176f - - default default] Lock "a8a23884-c76d-4690-a418-d67ad5bd459c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:05 np0005592767 podman[218703]: 2026-01-22 22:25:05.148732763 +0000 UTC m=+0.064855604 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:25:05 np0005592767 nova_compute[182623]: 2026-01-22 22:25:05.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:05 np0005592767 podman[218702]: 2026-01-22 22:25:05.197135807 +0000 UTC m=+0.107962141 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:25:05 np0005592767 nova_compute[182623]: 2026-01-22 22:25:05.689 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.181 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.301 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.302 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.327 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.626 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120695.6236222, a8a23884-c76d-4690-a418-d67ad5bd459c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.627 182627 INFO nova.compute.manager [-] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.658 182627 DEBUG nova.compute.manager [None req-366b117e-f646-4c43-a29a-022ac1276cef - - - - - -] [instance: a8a23884-c76d-4690-a418-d67ad5bd459c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.679 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.679 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.691 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.692 182627 INFO nova.compute.claims [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.876 182627 DEBUG nova.compute.provider_tree [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.890 182627 DEBUG nova.scheduler.client.report [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.908 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.920 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.921 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.982 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:25:10 np0005592767 nova_compute[182623]: 2026-01-22 22:25:10.982 182627 DEBUG nova.network.neutron [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:25:11 np0005592767 nova_compute[182623]: 2026-01-22 22:25:11.004 182627 INFO nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:25:11 np0005592767 nova_compute[182623]: 2026-01-22 22:25:11.032 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:25:11 np0005592767 nova_compute[182623]: 2026-01-22 22:25:11.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:11 np0005592767 nova_compute[182623]: 2026-01-22 22:25:11.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:11 np0005592767 nova_compute[182623]: 2026-01-22 22:25:11.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:25:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:12.096 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:12.096 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:12.096 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:12 np0005592767 podman[218749]: 2026-01-22 22:25:12.127990693 +0000 UTC m=+0.051667217 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 22 17:25:12 np0005592767 podman[218750]: 2026-01-22 22:25:12.138287859 +0000 UTC m=+0.054077634 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:25:12 np0005592767 nova_compute[182623]: 2026-01-22 22:25:12.154 182627 DEBUG nova.policy [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95cf9999380d48108a561554c1897f15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:25:13 np0005592767 nova_compute[182623]: 2026-01-22 22:25:13.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:13 np0005592767 nova_compute[182623]: 2026-01-22 22:25:13.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:13 np0005592767 nova_compute[182623]: 2026-01-22 22:25:13.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:25:13 np0005592767 nova_compute[182623]: 2026-01-22 22:25:13.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.860 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.862 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.863 182627 INFO nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Creating image(s)#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.864 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "/var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.865 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.866 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "/var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.894 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.894 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.895 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.896 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.917 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.974 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.975 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.976 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:14 np0005592767 nova_compute[182623]: 2026-01-22 22:25:14.986 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.045 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.046 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.079 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.080 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.080 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.138 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.140 182627 DEBUG nova.virt.disk.api [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Checking if we can resize image /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.141 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.184 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.199 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.200 182627 DEBUG nova.virt.disk.api [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Cannot resize image /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.200 182627 DEBUG nova.objects.instance [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'migration_context' on Instance uuid a892917e-4b80-4d70-9dc6-3b242345fa9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.215 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.215 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Ensure instance console log exists: /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.216 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.216 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.216 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.238 182627 DEBUG nova.network.neutron [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Successfully created port: 884ca156-0df5-4310-9173-b39b523419c2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:15 np0005592767 nova_compute[182623]: 2026-01-22 22:25:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.391 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.391 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.392 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.392 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.645 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.646 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5728MB free_disk=73.23573684692383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.646 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.646 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.726 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance a892917e-4b80-4d70-9dc6-3b242345fa9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.727 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.727 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.836 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.850 182627 DEBUG nova.network.neutron [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Successfully updated port: 884ca156-0df5-4310-9173-b39b523419c2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.852 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.875 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.875 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.875 182627 DEBUG nova.network.neutron [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.880 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:25:17 np0005592767 nova_compute[182623]: 2026-01-22 22:25:17.880 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:18 np0005592767 nova_compute[182623]: 2026-01-22 22:25:18.078 182627 DEBUG nova.network.neutron [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:25:18 np0005592767 nova_compute[182623]: 2026-01-22 22:25:18.459 182627 DEBUG nova.compute.manager [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received event network-changed-884ca156-0df5-4310-9173-b39b523419c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:18 np0005592767 nova_compute[182623]: 2026-01-22 22:25:18.459 182627 DEBUG nova.compute.manager [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Refreshing instance network info cache due to event network-changed-884ca156-0df5-4310-9173-b39b523419c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:25:18 np0005592767 nova_compute[182623]: 2026-01-22 22:25:18.460 182627 DEBUG oslo_concurrency.lockutils [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:25:19 np0005592767 podman[218806]: 2026-01-22 22:25:19.132973599 +0000 UTC m=+0.053475688 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.785 182627 DEBUG nova.network.neutron [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Updating instance_info_cache with network_info: [{"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.828 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.829 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance network_info: |[{"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.829 182627 DEBUG oslo_concurrency.lockutils [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.829 182627 DEBUG nova.network.neutron [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Refreshing network info cache for port 884ca156-0df5-4310-9173-b39b523419c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.831 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Start _get_guest_xml network_info=[{"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.834 182627 WARNING nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.842 182627 DEBUG nova.virt.libvirt.host [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.843 182627 DEBUG nova.virt.libvirt.host [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.846 182627 DEBUG nova.virt.libvirt.host [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.847 182627 DEBUG nova.virt.libvirt.host [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.849 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.849 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.850 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.850 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.851 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.851 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.852 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.852 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.853 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.853 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.853 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.854 182627 DEBUG nova.virt.hardware [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.860 182627 DEBUG nova.virt.libvirt.vif [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:25:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018382080',display_name='tempest-DeleteServersTestJSON-server-1018382080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018382080',id=56,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-rrpcot2p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJS
ON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:25:11Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a892917e-4b80-4d70-9dc6-3b242345fa9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.861 182627 DEBUG nova.network.os_vif_util [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.862 182627 DEBUG nova.network.os_vif_util [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.863 182627 DEBUG nova.objects.instance [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'pci_devices' on Instance uuid a892917e-4b80-4d70-9dc6-3b242345fa9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.875 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.882 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <uuid>a892917e-4b80-4d70-9dc6-3b242345fa9c</uuid>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <name>instance-00000038</name>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:name>tempest-DeleteServersTestJSON-server-1018382080</nova:name>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:25:19</nova:creationTime>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:user uuid="95cf9999380d48108a561554c1897f15">tempest-DeleteServersTestJSON-1655437746-project-member</nova:user>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:project uuid="9f8f780ce45a4950b1666a54cd9a5ba0">tempest-DeleteServersTestJSON-1655437746</nova:project>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        <nova:port uuid="884ca156-0df5-4310-9173-b39b523419c2">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <entry name="serial">a892917e-4b80-4d70-9dc6-3b242345fa9c</entry>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <entry name="uuid">a892917e-4b80-4d70-9dc6-3b242345fa9c</entry>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.config"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b5:fa:79"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <target dev="tap884ca156-0d"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/console.log" append="off"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:25:19 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:25:19 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:25:19 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:25:19 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.883 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Preparing to wait for external event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.883 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.883 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.884 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.885 182627 DEBUG nova.virt.libvirt.vif [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:25:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018382080',display_name='tempest-DeleteServersTestJSON-server-1018382080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018382080',id=56,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-rrpcot2p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteSer
versTestJSON-1655437746-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:25:11Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a892917e-4b80-4d70-9dc6-3b242345fa9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.885 182627 DEBUG nova.network.os_vif_util [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.886 182627 DEBUG nova.network.os_vif_util [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.887 182627 DEBUG os_vif [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.888 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.888 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.888 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.895 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.896 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap884ca156-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.896 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap884ca156-0d, col_values=(('external_ids', {'iface-id': '884ca156-0df5-4310-9173-b39b523419c2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:fa:79', 'vm-uuid': 'a892917e-4b80-4d70-9dc6-3b242345fa9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.898 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:19 np0005592767 NetworkManager[54973]: <info>  [1769120719.9000] manager: (tap884ca156-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.901 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.904 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.905 182627 INFO os_vif [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d')#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.968 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.969 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.969 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] No VIF found with MAC fa:16:3e:b5:fa:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:25:19 np0005592767 nova_compute[182623]: 2026-01-22 22:25:19.970 182627 INFO nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Using config drive#033[00m
Jan 22 17:25:20 np0005592767 nova_compute[182623]: 2026-01-22 22:25:20.187 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.095 182627 INFO nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Creating config drive at /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.config#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.101 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyvmkr2dn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.240 182627 DEBUG oslo_concurrency.processutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyvmkr2dn" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:21 np0005592767 kernel: tap884ca156-0d: entered promiscuous mode
Jan 22 17:25:21 np0005592767 NetworkManager[54973]: <info>  [1769120721.3136] manager: (tap884ca156-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.313 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:21Z|00195|binding|INFO|Claiming lport 884ca156-0df5-4310-9173-b39b523419c2 for this chassis.
Jan 22 17:25:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:21Z|00196|binding|INFO|884ca156-0df5-4310-9173-b39b523419c2: Claiming fa:16:3e:b5:fa:79 10.100.0.6
Jan 22 17:25:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:21Z|00197|binding|INFO|Setting lport 884ca156-0df5-4310-9173-b39b523419c2 ovn-installed in OVS
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.336 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.337 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 systemd-udevd[218847]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:25:21 np0005592767 NetworkManager[54973]: <info>  [1769120721.3749] device (tap884ca156-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:25:21 np0005592767 NetworkManager[54973]: <info>  [1769120721.3763] device (tap884ca156-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:25:21 np0005592767 systemd-machined[153912]: New machine qemu-28-instance-00000038.
Jan 22 17:25:21 np0005592767 systemd[1]: Started Virtual Machine qemu-28-instance-00000038.
Jan 22 17:25:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:21Z|00198|binding|INFO|Setting lport 884ca156-0df5-4310-9173-b39b523419c2 up in Southbound
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.409 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:fa:79 10.100.0.6'], port_security=['fa:16:3e:b5:fa:79 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a892917e-4b80-4d70-9dc6-3b242345fa9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=884ca156-0df5-4310-9173-b39b523419c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.411 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 884ca156-0df5-4310-9173-b39b523419c2 in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad bound to our chassis#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.412 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.433 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[52d1218a-f7e8-434c-a324-e24b3ec4357f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.434 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap976277ea-61 in ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.438 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap976277ea-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.439 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[39442a6e-3d58-4273-aeed-1fcaf43cedb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.440 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2256fa13-f461-4fd3-9fa3-f6905d0517ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.464 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a05a8d39-d1a2-42c7-8deb-e0527148869e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.495 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd1fa2f-2553-4c46-9a3d-e42b51e9f226]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.535 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc5589e-a8dd-4215-ae0f-1fcf2cf6866a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 systemd-udevd[218852]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.544 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f4359508-b85e-4398-aed2-30f0e21cdb43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 NetworkManager[54973]: <info>  [1769120721.5446] manager: (tap976277ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.584 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5e532c16-5421-47db-9b38-a2d194e064aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.589 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a270cc71-cd33-4708-bea9-c77b1c6415d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 NetworkManager[54973]: <info>  [1769120721.6190] device (tap976277ea-60): carrier: link connected
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.627 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0a45b7d1-1653-420a-b5a9-be537f6440af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.652 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[00881184-3dfc-47bd-818a-483084e43683]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431819, 'reachable_time': 26256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218883, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.677 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d1863b40-08e7-4b5c-8914-215e68620759]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:95d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431819, 'tstamp': 431819}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218884, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.699 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3f079178-aadb-448c-bf6a-1c457efc9e48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap976277ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:95:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431819, 'reachable_time': 26256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218885, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.751 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[817d3ef3-7c59-40c8-bacd-215dba335891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.832 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[41bce2d8-b55c-43de-826c-93dd942fb64f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.834 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.834 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.835 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap976277ea-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.836 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 NetworkManager[54973]: <info>  [1769120721.8375] manager: (tap976277ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 22 17:25:21 np0005592767 kernel: tap976277ea-60: entered promiscuous mode
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.839 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.840 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap976277ea-60, col_values=(('external_ids', {'iface-id': '06db452d-91a0-4ebb-b584-a57953634a03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.841 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:21Z|00199|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.858 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 nova_compute[182623]: 2026-01-22 22:25:21.859 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.859 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.860 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c4022f-4afa-4dd0-8563-db11577dae30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.861 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/976277ea-61b2-4223-a8f7-3d46bf9c98ad.pid.haproxy
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 976277ea-61b2-4223-a8f7-3d46bf9c98ad
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:25:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:21.861 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'env', 'PROCESS_TAG=haproxy-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/976277ea-61b2-4223-a8f7-3d46bf9c98ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.228 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120722.2259269, a892917e-4b80-4d70-9dc6-3b242345fa9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.229 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] VM Started (Lifecycle Event)#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.265 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.269 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120722.2281392, a892917e-4b80-4d70-9dc6-3b242345fa9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.270 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.326 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.329 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:25:22 np0005592767 podman[218924]: 2026-01-22 22:25:22.372692263 +0000 UTC m=+0.106404736 container create 7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:25:22 np0005592767 podman[218924]: 2026-01-22 22:25:22.291267904 +0000 UTC m=+0.024980387 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.387 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:25:22 np0005592767 systemd[1]: Started libpod-conmon-7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365.scope.
Jan 22 17:25:22 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:25:22 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16119f92750af278e74b3becc9e62de4d9b3da31a67ba6a1b3fca4528d612733/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.526 182627 DEBUG nova.compute.manager [req-4cb2ee3a-5dbe-4aaf-8796-a80caac5df55 req-a7a287aa-f394-410d-8530-73c26bc79a3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.527 182627 DEBUG oslo_concurrency.lockutils [req-4cb2ee3a-5dbe-4aaf-8796-a80caac5df55 req-a7a287aa-f394-410d-8530-73c26bc79a3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.527 182627 DEBUG oslo_concurrency.lockutils [req-4cb2ee3a-5dbe-4aaf-8796-a80caac5df55 req-a7a287aa-f394-410d-8530-73c26bc79a3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.527 182627 DEBUG oslo_concurrency.lockutils [req-4cb2ee3a-5dbe-4aaf-8796-a80caac5df55 req-a7a287aa-f394-410d-8530-73c26bc79a3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.528 182627 DEBUG nova.compute.manager [req-4cb2ee3a-5dbe-4aaf-8796-a80caac5df55 req-a7a287aa-f394-410d-8530-73c26bc79a3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Processing event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.528 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:25:22 np0005592767 podman[218924]: 2026-01-22 22:25:22.529131381 +0000 UTC m=+0.262843884 container init 7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.534 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120722.5342422, a892917e-4b80-4d70-9dc6-3b242345fa9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:25:22 np0005592767 podman[218924]: 2026-01-22 22:25:22.535035157 +0000 UTC m=+0.268747650 container start 7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.535 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.537 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.542 182627 INFO nova.virt.libvirt.driver [-] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance spawned successfully.#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.542 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:25:22 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [NOTICE]   (218943) : New worker (218945) forked
Jan 22 17:25:22 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [NOTICE]   (218943) : Loading success.
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.562 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.569 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.572 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.573 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.573 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.574 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.574 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.575 182627 DEBUG nova.virt.libvirt.driver [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.670 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.990 182627 INFO nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Took 8.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:25:22 np0005592767 nova_compute[182623]: 2026-01-22 22:25:22.991 182627 DEBUG nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:23 np0005592767 nova_compute[182623]: 2026-01-22 22:25:23.214 182627 INFO nova.compute.manager [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Took 12.59 seconds to build instance.#033[00m
Jan 22 17:25:23 np0005592767 nova_compute[182623]: 2026-01-22 22:25:23.266 182627 DEBUG nova.network.neutron [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Updated VIF entry in instance network info cache for port 884ca156-0df5-4310-9173-b39b523419c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:25:23 np0005592767 nova_compute[182623]: 2026-01-22 22:25:23.268 182627 DEBUG nova.network.neutron [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Updating instance_info_cache with network_info: [{"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:25:23 np0005592767 nova_compute[182623]: 2026-01-22 22:25:23.272 182627 DEBUG oslo_concurrency.lockutils [None req-256c4cf4-7492-4fca-b797-79348e592d28 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:23 np0005592767 nova_compute[182623]: 2026-01-22 22:25:23.279 182627 DEBUG oslo_concurrency.lockutils [req-463d3f7d-396c-4fdb-843c-5a3b05a27dab req-36f47af9-c9ef-4cb2-ba65-a5ea95cb1d0b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:25:24 np0005592767 nova_compute[182623]: 2026-01-22 22:25:24.900 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.109 182627 DEBUG nova.compute.manager [req-93226788-f355-4ed3-b841-87d8f3a02707 req-c8d9ba8e-ce97-4bdb-906b-f29b8917e681 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.110 182627 DEBUG oslo_concurrency.lockutils [req-93226788-f355-4ed3-b841-87d8f3a02707 req-c8d9ba8e-ce97-4bdb-906b-f29b8917e681 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.110 182627 DEBUG oslo_concurrency.lockutils [req-93226788-f355-4ed3-b841-87d8f3a02707 req-c8d9ba8e-ce97-4bdb-906b-f29b8917e681 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.111 182627 DEBUG oslo_concurrency.lockutils [req-93226788-f355-4ed3-b841-87d8f3a02707 req-c8d9ba8e-ce97-4bdb-906b-f29b8917e681 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.111 182627 DEBUG nova.compute.manager [req-93226788-f355-4ed3-b841-87d8f3a02707 req-c8d9ba8e-ce97-4bdb-906b-f29b8917e681 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] No waiting events found dispatching network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.112 182627 WARNING nova.compute.manager [req-93226788-f355-4ed3-b841-87d8f3a02707 req-c8d9ba8e-ce97-4bdb-906b-f29b8917e681 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received unexpected event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.189 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.949 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.950 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:25 np0005592767 nova_compute[182623]: 2026-01-22 22:25:25.951 182627 INFO nova.compute.manager [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Shelving#033[00m
Jan 22 17:25:26 np0005592767 nova_compute[182623]: 2026-01-22 22:25:26.009 182627 DEBUG nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:25:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:26Z|00200|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 17:25:26 np0005592767 nova_compute[182623]: 2026-01-22 22:25:26.147 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:26Z|00201|binding|INFO|Releasing lport 06db452d-91a0-4ebb-b584-a57953634a03 from this chassis (sb_readonly=0)
Jan 22 17:25:26 np0005592767 nova_compute[182623]: 2026-01-22 22:25:26.298 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:29 np0005592767 nova_compute[182623]: 2026-01-22 22:25:29.902 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:30 np0005592767 nova_compute[182623]: 2026-01-22 22:25:30.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:31 np0005592767 podman[218956]: 2026-01-22 22:25:31.168261453 +0000 UTC m=+0.072560741 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:25:34 np0005592767 nova_compute[182623]: 2026-01-22 22:25:34.907 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:35 np0005592767 nova_compute[182623]: 2026-01-22 22:25:35.193 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:35Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b5:fa:79 10.100.0.6
Jan 22 17:25:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:35Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b5:fa:79 10.100.0.6
Jan 22 17:25:36 np0005592767 nova_compute[182623]: 2026-01-22 22:25:36.056 182627 DEBUG nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:25:36 np0005592767 podman[218990]: 2026-01-22 22:25:36.182255224 +0000 UTC m=+0.083946132 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 17:25:36 np0005592767 podman[218989]: 2026-01-22 22:25:36.230124496 +0000 UTC m=+0.127307836 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:25:38 np0005592767 kernel: tap884ca156-0d (unregistering): left promiscuous mode
Jan 22 17:25:38 np0005592767 NetworkManager[54973]: <info>  [1769120738.2821] device (tap884ca156-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:25:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:38Z|00202|binding|INFO|Releasing lport 884ca156-0df5-4310-9173-b39b523419c2 from this chassis (sb_readonly=0)
Jan 22 17:25:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:38Z|00203|binding|INFO|Setting lport 884ca156-0df5-4310-9173-b39b523419c2 down in Southbound
Jan 22 17:25:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:25:38Z|00204|binding|INFO|Removing iface tap884ca156-0d ovn-installed in OVS
Jan 22 17:25:38 np0005592767 nova_compute[182623]: 2026-01-22 22:25:38.293 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.317 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:fa:79 10.100.0.6'], port_security=['fa:16:3e:b5:fa:79 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a892917e-4b80-4d70-9dc6-3b242345fa9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f8f780ce45a4950b1666a54cd9a5ba0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a79c2fee-55f0-401e-98f6-87e5b69fa040', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2948110-e4a4-4acd-8637-8446d002e78a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=884ca156-0df5-4310-9173-b39b523419c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.318 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 884ca156-0df5-4310-9173-b39b523419c2 in datapath 976277ea-61b2-4223-a8f7-3d46bf9c98ad unbound from our chassis#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.320 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 976277ea-61b2-4223-a8f7-3d46bf9c98ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:25:38 np0005592767 nova_compute[182623]: 2026-01-22 22:25:38.327 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.325 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[39c2ec53-2c9f-412c-b735-abf5d13740be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.326 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad namespace which is not needed anymore#033[00m
Jan 22 17:25:38 np0005592767 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 22 17:25:38 np0005592767 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000038.scope: Consumed 13.481s CPU time.
Jan 22 17:25:38 np0005592767 systemd-machined[153912]: Machine qemu-28-instance-00000038 terminated.
Jan 22 17:25:38 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [NOTICE]   (218943) : haproxy version is 2.8.14-c23fe91
Jan 22 17:25:38 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [NOTICE]   (218943) : path to executable is /usr/sbin/haproxy
Jan 22 17:25:38 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [WARNING]  (218943) : Exiting Master process...
Jan 22 17:25:38 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [ALERT]    (218943) : Current worker (218945) exited with code 143 (Terminated)
Jan 22 17:25:38 np0005592767 neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad[218939]: [WARNING]  (218943) : All workers exited. Exiting... (0)
Jan 22 17:25:38 np0005592767 systemd[1]: libpod-7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365.scope: Deactivated successfully.
Jan 22 17:25:38 np0005592767 podman[219060]: 2026-01-22 22:25:38.54126183 +0000 UTC m=+0.068913977 container died 7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:25:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365-userdata-shm.mount: Deactivated successfully.
Jan 22 17:25:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay-16119f92750af278e74b3becc9e62de4d9b3da31a67ba6a1b3fca4528d612733-merged.mount: Deactivated successfully.
Jan 22 17:25:38 np0005592767 podman[219060]: 2026-01-22 22:25:38.595389378 +0000 UTC m=+0.123041525 container cleanup 7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:25:38 np0005592767 systemd[1]: libpod-conmon-7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365.scope: Deactivated successfully.
Jan 22 17:25:38 np0005592767 podman[219102]: 2026-01-22 22:25:38.696111373 +0000 UTC m=+0.064533964 container remove 7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.705 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c7be5011-d458-42d4-b63e-f396d6ae257b]: (4, ('Thu Jan 22 10:25:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365)\n7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365\nThu Jan 22 10:25:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad (7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365)\n7e45cb58b9ef0b8fd74064b6a21cd206a0861d6e77b9b7db61ee6e4219bbd365\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.708 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab5624b-ba25-4d09-baac-d46e4f488a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.711 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap976277ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:38 np0005592767 nova_compute[182623]: 2026-01-22 22:25:38.714 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:38 np0005592767 kernel: tap976277ea-60: left promiscuous mode
Jan 22 17:25:38 np0005592767 nova_compute[182623]: 2026-01-22 22:25:38.731 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.737 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[39b6baca-f6a7-4c9f-a7bc-56968baf72a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.748 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[03162592-db7f-49dc-8003-ff1761d01d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.750 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[be5c1e4e-83e4-4336-9ad9-14b541718236]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.768 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[80d33cec-6bd9-415d-a4b9-48a7b0bd3389]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431810, 'reachable_time': 31862, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219120, 'error': None, 'target': 'ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:38 np0005592767 systemd[1]: run-netns-ovnmeta\x2d976277ea\x2d61b2\x2d4223\x2da8f7\x2d3d46bf9c98ad.mount: Deactivated successfully.
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.779 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-976277ea-61b2-4223-a8f7-3d46bf9c98ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:25:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:38.781 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[696f27c3-cce0-44e7-82cb-1c6380c5afaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.072 182627 INFO nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.078 182627 INFO nova.virt.libvirt.driver [-] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance destroyed successfully.#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.079 182627 DEBUG nova.objects.instance [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'numa_topology' on Instance uuid a892917e-4b80-4d70-9dc6-3b242345fa9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.121 182627 DEBUG nova.compute.manager [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received event network-vif-unplugged-884ca156-0df5-4310-9173-b39b523419c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.121 182627 DEBUG oslo_concurrency.lockutils [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.121 182627 DEBUG oslo_concurrency.lockutils [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.122 182627 DEBUG oslo_concurrency.lockutils [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.122 182627 DEBUG nova.compute.manager [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] No waiting events found dispatching network-vif-unplugged-884ca156-0df5-4310-9173-b39b523419c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.122 182627 WARNING nova.compute.manager [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received unexpected event network-vif-unplugged-884ca156-0df5-4310-9173-b39b523419c2 for instance with vm_state active and task_state shelving.#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.122 182627 DEBUG nova.compute.manager [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.123 182627 DEBUG oslo_concurrency.lockutils [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.123 182627 DEBUG oslo_concurrency.lockutils [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.123 182627 DEBUG oslo_concurrency.lockutils [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.123 182627 DEBUG nova.compute.manager [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] No waiting events found dispatching network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.124 182627 WARNING nova.compute.manager [req-e1af58f7-871f-4bfc-8e49-d58c1c4da96d req-d9a1447d-9c36-4fc8-9f83-6913b292cc5a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received unexpected event network-vif-plugged-884ca156-0df5-4310-9173-b39b523419c2 for instance with vm_state active and task_state shelving.#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.650 182627 INFO nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Beginning cold snapshot process#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.858 182627 DEBUG nova.privsep.utils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.858 182627 DEBUG oslo_concurrency.processutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk /var/lib/nova/instances/snapshots/tmp5zg47nrf/57ca008bf80e4b95804899526d19a36a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:25:39 np0005592767 nova_compute[182623]: 2026-01-22 22:25:39.908 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:40 np0005592767 nova_compute[182623]: 2026-01-22 22:25:40.233 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:40 np0005592767 nova_compute[182623]: 2026-01-22 22:25:40.307 182627 DEBUG oslo_concurrency.processutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c/disk /var/lib/nova/instances/snapshots/tmp5zg47nrf/57ca008bf80e4b95804899526d19a36a" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:25:40 np0005592767 nova_compute[182623]: 2026-01-22 22:25:40.308 182627 INFO nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:25:43 np0005592767 podman[219132]: 2026-01-22 22:25:43.168488628 +0000 UTC m=+0.073758443 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:25:43 np0005592767 podman[219131]: 2026-01-22 22:25:43.168143679 +0000 UTC m=+0.079661991 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.407 182627 INFO nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Snapshot image upload complete#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.408 182627 DEBUG nova.compute.manager [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.528 182627 INFO nova.compute.manager [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Shelve offloading#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.547 182627 INFO nova.virt.libvirt.driver [-] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance destroyed successfully.#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.547 182627 DEBUG nova.compute.manager [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.550 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.551 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquired lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.551 182627 DEBUG nova.network.neutron [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:25:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:44.730 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.730 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:44.731 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:25:44 np0005592767 nova_compute[182623]: 2026-01-22 22:25:44.911 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:45 np0005592767 nova_compute[182623]: 2026-01-22 22:25:45.237 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:47 np0005592767 nova_compute[182623]: 2026-01-22 22:25:47.505 182627 DEBUG nova.network.neutron [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Updating instance_info_cache with network_info: [{"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:25:47 np0005592767 nova_compute[182623]: 2026-01-22 22:25:47.528 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Releasing lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.644 182627 INFO nova.virt.libvirt.driver [-] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Instance destroyed successfully.#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.645 182627 DEBUG nova.objects.instance [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lazy-loading 'resources' on Instance uuid a892917e-4b80-4d70-9dc6-3b242345fa9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.658 182627 DEBUG nova.virt.libvirt.vif [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:25:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018382080',display_name='tempest-DeleteServersTestJSON-server-1018382080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018382080',id=56,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:25:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9f8f780ce45a4950b1666a54cd9a5ba0',ramdisk_id='',reservation_id='r-rrpcot2p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1655437746',owner_user_name='tempest-DeleteServersTestJSON-1655437746-project-member',shelved_at='2026-01-22T22:25:44.408856',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='d8e7e821-35ab-4da5-bdec-40d4339b2688'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:25:40Z,user_data=None,user_id='95cf9999380d48108a561554c1897f15',uuid=a892917e-4b80-4d70-9dc6-3b242345fa9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.659 182627 DEBUG nova.network.os_vif_util [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converting VIF {"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap884ca156-0d", "ovs_interfaceid": "884ca156-0df5-4310-9173-b39b523419c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.660 182627 DEBUG nova.network.os_vif_util [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.660 182627 DEBUG os_vif [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.662 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.662 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap884ca156-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.709 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.712 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.714 182627 INFO os_vif [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:fa:79,bridge_name='br-int',has_traffic_filtering=True,id=884ca156-0df5-4310-9173-b39b523419c2,network=Network(976277ea-61b2-4223-a8f7-3d46bf9c98ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap884ca156-0d')#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.715 182627 INFO nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Deleting instance files /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c_del#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.722 182627 INFO nova.virt.libvirt.driver [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Deletion of /var/lib/nova/instances/a892917e-4b80-4d70-9dc6-3b242345fa9c_del complete#033[00m
Jan 22 17:25:48 np0005592767 nova_compute[182623]: 2026-01-22 22:25:48.938 182627 INFO nova.scheduler.client.report [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Deleted allocations for instance a892917e-4b80-4d70-9dc6-3b242345fa9c#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.022 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.022 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.063 182627 DEBUG nova.scheduler.client.report [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.111 182627 DEBUG nova.scheduler.client.report [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.112 182627 DEBUG nova.compute.provider_tree [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.129 182627 DEBUG nova.compute.manager [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Received event network-changed-884ca156-0df5-4310-9173-b39b523419c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.130 182627 DEBUG nova.compute.manager [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Refreshing instance network info cache due to event network-changed-884ca156-0df5-4310-9173-b39b523419c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.130 182627 DEBUG oslo_concurrency.lockutils [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.130 182627 DEBUG oslo_concurrency.lockutils [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.131 182627 DEBUG nova.network.neutron [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Refreshing network info cache for port 884ca156-0df5-4310-9173-b39b523419c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.153 182627 DEBUG nova.scheduler.client.report [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.195 182627 DEBUG nova.scheduler.client.report [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.232 182627 DEBUG nova.compute.provider_tree [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.247 182627 DEBUG nova.scheduler.client.report [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.278 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:49 np0005592767 nova_compute[182623]: 2026-01-22 22:25:49.395 182627 DEBUG oslo_concurrency.lockutils [None req-3bf88356-bfc5-4c3e-942b-ef7c5c9921c4 95cf9999380d48108a561554c1897f15 9f8f780ce45a4950b1666a54cd9a5ba0 - - default default] Lock "a892917e-4b80-4d70-9dc6-3b242345fa9c" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 23.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:25:50 np0005592767 podman[219174]: 2026-01-22 22:25:50.117507914 +0000 UTC m=+0.042868862 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:25:50 np0005592767 nova_compute[182623]: 2026-01-22 22:25:50.310 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:25:50.734 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:25:52 np0005592767 nova_compute[182623]: 2026-01-22 22:25:52.897 182627 DEBUG nova.network.neutron [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Updated VIF entry in instance network info cache for port 884ca156-0df5-4310-9173-b39b523419c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:25:52 np0005592767 nova_compute[182623]: 2026-01-22 22:25:52.898 182627 DEBUG nova.network.neutron [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Updating instance_info_cache with network_info: [{"id": "884ca156-0df5-4310-9173-b39b523419c2", "address": "fa:16:3e:b5:fa:79", "network": {"id": "976277ea-61b2-4223-a8f7-3d46bf9c98ad", "bridge": null, "label": "tempest-DeleteServersTestJSON-311555490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f8f780ce45a4950b1666a54cd9a5ba0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap884ca156-0d", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:25:52 np0005592767 nova_compute[182623]: 2026-01-22 22:25:52.925 182627 DEBUG oslo_concurrency.lockutils [req-ad2ac347-15c1-417b-b96b-d0ab63b6cb11 req-e91bb2f9-6b7a-4977-a6d2-210f6f03c292 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a892917e-4b80-4d70-9dc6-3b242345fa9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:25:53 np0005592767 nova_compute[182623]: 2026-01-22 22:25:53.577 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120738.575416, a892917e-4b80-4d70-9dc6-3b242345fa9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:25:53 np0005592767 nova_compute[182623]: 2026-01-22 22:25:53.577 182627 INFO nova.compute.manager [-] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:25:53 np0005592767 nova_compute[182623]: 2026-01-22 22:25:53.641 182627 DEBUG nova.compute.manager [None req-9c39509d-fcd7-440a-8315-eca7dc2655bd - - - - - -] [instance: a892917e-4b80-4d70-9dc6-3b242345fa9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:25:53 np0005592767 nova_compute[182623]: 2026-01-22 22:25:53.710 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:55 np0005592767 nova_compute[182623]: 2026-01-22 22:25:55.315 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:25:58 np0005592767 nova_compute[182623]: 2026-01-22 22:25:58.714 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:00 np0005592767 nova_compute[182623]: 2026-01-22 22:26:00.317 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:02 np0005592767 podman[219199]: 2026-01-22 22:26:02.133965069 +0000 UTC m=+0.059002138 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true)
Jan 22 17:26:03 np0005592767 nova_compute[182623]: 2026-01-22 22:26:03.718 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:05 np0005592767 nova_compute[182623]: 2026-01-22 22:26:05.350 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:07 np0005592767 podman[219220]: 2026-01-22 22:26:07.177744601 +0000 UTC m=+0.080407442 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Jan 22 17:26:07 np0005592767 podman[219219]: 2026-01-22 22:26:07.206695048 +0000 UTC m=+0.113583068 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:26:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:26:07 np0005592767 nova_compute[182623]: 2026-01-22 22:26:07.789 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "f619e46f-8faf-4be7-bf91-55c442ffb031" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:07 np0005592767 nova_compute[182623]: 2026-01-22 22:26:07.790 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:07 np0005592767 nova_compute[182623]: 2026-01-22 22:26:07.862 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.015 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.016 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.026 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.027 182627 INFO nova.compute.claims [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.164 182627 DEBUG nova.compute.provider_tree [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.194 182627 DEBUG nova.scheduler.client.report [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.219 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.220 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.297 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.298 182627 DEBUG nova.network.neutron [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.334 182627 INFO nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.357 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.554 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.556 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.557 182627 INFO nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Creating image(s)#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.558 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "/var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.559 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "/var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.560 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "/var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.592 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.673 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.674 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.674 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.684 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.704 182627 DEBUG nova.policy [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2e521f56216f40fd986489aada143152', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4e421ef5ced94104b2eb81cb88740956', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.776 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.802 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.804 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.835 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.836 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.837 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.887 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.888 182627 DEBUG nova.virt.disk.api [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Checking if we can resize image /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.888 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.937 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.938 182627 DEBUG nova.virt.disk.api [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Cannot resize image /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.939 182627 DEBUG nova.objects.instance [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lazy-loading 'migration_context' on Instance uuid f619e46f-8faf-4be7-bf91-55c442ffb031 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.963 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.964 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Ensure instance console log exists: /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.964 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.964 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:08 np0005592767 nova_compute[182623]: 2026-01-22 22:26:08.965 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:10 np0005592767 nova_compute[182623]: 2026-01-22 22:26:10.352 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:10 np0005592767 nova_compute[182623]: 2026-01-22 22:26:10.643 182627 DEBUG nova.network.neutron [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Successfully created port: d3907878-1a82-42e9-b0b0-ed0b768e7e68 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:26:11 np0005592767 nova_compute[182623]: 2026-01-22 22:26:11.895 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:11 np0005592767 nova_compute[182623]: 2026-01-22 22:26:11.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:11 np0005592767 nova_compute[182623]: 2026-01-22 22:26:11.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:26:11 np0005592767 nova_compute[182623]: 2026-01-22 22:26:11.992 182627 DEBUG nova.network.neutron [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Successfully updated port: d3907878-1a82-42e9-b0b0-ed0b768e7e68 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:26:12 np0005592767 nova_compute[182623]: 2026-01-22 22:26:12.020 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "refresh_cache-f619e46f-8faf-4be7-bf91-55c442ffb031" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:26:12 np0005592767 nova_compute[182623]: 2026-01-22 22:26:12.020 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquired lock "refresh_cache-f619e46f-8faf-4be7-bf91-55c442ffb031" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:26:12 np0005592767 nova_compute[182623]: 2026-01-22 22:26:12.021 182627 DEBUG nova.network.neutron [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:26:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:12.097 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:12.098 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:12.098 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:12 np0005592767 nova_compute[182623]: 2026-01-22 22:26:12.271 182627 DEBUG nova.network.neutron [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:26:12 np0005592767 nova_compute[182623]: 2026-01-22 22:26:12.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.589 182627 DEBUG nova.network.neutron [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Updating instance_info_cache with network_info: [{"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.626 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Releasing lock "refresh_cache-f619e46f-8faf-4be7-bf91-55c442ffb031" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.627 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Instance network_info: |[{"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.631 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Start _get_guest_xml network_info=[{"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.637 182627 WARNING nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.643 182627 DEBUG nova.virt.libvirt.host [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.644 182627 DEBUG nova.virt.libvirt.host [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.648 182627 DEBUG nova.virt.libvirt.host [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.649 182627 DEBUG nova.virt.libvirt.host [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.650 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.650 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.651 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.651 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.651 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.651 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.651 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.652 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.652 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.652 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.652 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.653 182627 DEBUG nova.virt.hardware [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.657 182627 DEBUG nova.virt.libvirt.vif [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-119284419',display_name='tempest-ImagesOneServerTestJSON-server-119284419',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-119284419',id=60,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4e421ef5ced94104b2eb81cb88740956',ramdisk_id='',reservation_id='r-dw6w16jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-411724568',owner_user_name='tempest-ImagesOneServer
TestJSON-411724568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:08Z,user_data=None,user_id='2e521f56216f40fd986489aada143152',uuid=f619e46f-8faf-4be7-bf91-55c442ffb031,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.657 182627 DEBUG nova.network.os_vif_util [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Converting VIF {"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.658 182627 DEBUG nova.network.os_vif_util [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.658 182627 DEBUG nova.objects.instance [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lazy-loading 'pci_devices' on Instance uuid f619e46f-8faf-4be7-bf91-55c442ffb031 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.674 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <uuid>f619e46f-8faf-4be7-bf91-55c442ffb031</uuid>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <name>instance-0000003c</name>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:name>tempest-ImagesOneServerTestJSON-server-119284419</nova:name>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:26:13</nova:creationTime>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:user uuid="2e521f56216f40fd986489aada143152">tempest-ImagesOneServerTestJSON-411724568-project-member</nova:user>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:project uuid="4e421ef5ced94104b2eb81cb88740956">tempest-ImagesOneServerTestJSON-411724568</nova:project>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        <nova:port uuid="d3907878-1a82-42e9-b0b0-ed0b768e7e68">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <entry name="serial">f619e46f-8faf-4be7-bf91-55c442ffb031</entry>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <entry name="uuid">f619e46f-8faf-4be7-bf91-55c442ffb031</entry>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.config"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:9f:36:b0"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <target dev="tapd3907878-1a"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/console.log" append="off"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:26:13 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:26:13 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:26:13 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:26:13 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.675 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Preparing to wait for external event network-vif-plugged-d3907878-1a82-42e9-b0b0-ed0b768e7e68 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.675 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.675 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.675 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.676 182627 DEBUG nova.virt.libvirt.vif [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-119284419',display_name='tempest-ImagesOneServerTestJSON-server-119284419',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-119284419',id=60,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4e421ef5ced94104b2eb81cb88740956',ramdisk_id='',reservation_id='r-dw6w16jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-411724568',owner_user_name='tempest-ImagesOneServerTestJSON-411724568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:08Z,user_data=None,user_id='2e521f56216f40fd986489aada143152',uuid=f619e46f-8faf-4be7-bf91-55c442ffb031,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.676 182627 DEBUG nova.network.os_vif_util [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Converting VIF {"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.677 182627 DEBUG nova.network.os_vif_util [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.677 182627 DEBUG os_vif [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.678 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.678 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.678 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.682 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.683 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3907878-1a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.684 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3907878-1a, col_values=(('external_ids', {'iface-id': 'd3907878-1a82-42e9-b0b0-ed0b768e7e68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:36:b0', 'vm-uuid': 'f619e46f-8faf-4be7-bf91-55c442ffb031'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.686 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:13 np0005592767 NetworkManager[54973]: <info>  [1769120773.6885] manager: (tapd3907878-1a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.690 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.695 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.696 182627 INFO os_vif [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a')#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.800 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.801 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.801 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] No VIF found with MAC fa:16:3e:9f:36:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.802 182627 INFO nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Using config drive#033[00m
Jan 22 17:26:13 np0005592767 podman[219283]: 2026-01-22 22:26:13.806428338 +0000 UTC m=+0.066816887 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:26:13 np0005592767 podman[219282]: 2026-01-22 22:26:13.818411287 +0000 UTC m=+0.078678273 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.895 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:13 np0005592767 nova_compute[182623]: 2026-01-22 22:26:13.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.034 182627 DEBUG nova.compute.manager [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Received event network-changed-d3907878-1a82-42e9-b0b0-ed0b768e7e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.035 182627 DEBUG nova.compute.manager [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Refreshing instance network info cache due to event network-changed-d3907878-1a82-42e9-b0b0-ed0b768e7e68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.035 182627 DEBUG oslo_concurrency.lockutils [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f619e46f-8faf-4be7-bf91-55c442ffb031" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.035 182627 DEBUG oslo_concurrency.lockutils [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f619e46f-8faf-4be7-bf91-55c442ffb031" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.035 182627 DEBUG nova.network.neutron [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Refreshing network info cache for port d3907878-1a82-42e9-b0b0-ed0b768e7e68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.305 182627 INFO nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Creating config drive at /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.config#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.316 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkj297u43 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.445 182627 DEBUG oslo_concurrency.processutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkj297u43" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:14 np0005592767 kernel: tapd3907878-1a: entered promiscuous mode
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.523 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:14Z|00205|binding|INFO|Claiming lport d3907878-1a82-42e9-b0b0-ed0b768e7e68 for this chassis.
Jan 22 17:26:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:14Z|00206|binding|INFO|d3907878-1a82-42e9-b0b0-ed0b768e7e68: Claiming fa:16:3e:9f:36:b0 10.100.0.4
Jan 22 17:26:14 np0005592767 NetworkManager[54973]: <info>  [1769120774.5271] manager: (tapd3907878-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.528 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.552 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:36:b0 10.100.0.4'], port_security=['fa:16:3e:9f:36:b0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f619e46f-8faf-4be7-bf91-55c442ffb031', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4e421ef5ced94104b2eb81cb88740956', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b787468c-31bd-4ea3-a82f-2733be2a5b09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28adf37a-ced0-49be-897a-97e8d81fa9a8, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=d3907878-1a82-42e9-b0b0-ed0b768e7e68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.553 104135 INFO neutron.agent.ovn.metadata.agent [-] Port d3907878-1a82-42e9-b0b0-ed0b768e7e68 in datapath 3f321120-f6a6-4fa1-8b1b-f23890fe2465 bound to our chassis#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.554 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f321120-f6a6-4fa1-8b1b-f23890fe2465#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.566 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c264248c-9834-44c9-a419-3e17876f3997]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.567 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f321120-f1 in ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.569 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f321120-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.569 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fefe0063-09e1-428b-8105-53f6fd615126]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.571 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a07e3367-b4c7-4814-bcf8-7e85fae97563]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 systemd-machined[153912]: New machine qemu-29-instance-0000003c.
Jan 22 17:26:14 np0005592767 systemd-udevd[219341]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.577 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:14Z|00207|binding|INFO|Setting lport d3907878-1a82-42e9-b0b0-ed0b768e7e68 ovn-installed in OVS
Jan 22 17:26:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:14Z|00208|binding|INFO|Setting lport d3907878-1a82-42e9-b0b0-ed0b768e7e68 up in Southbound
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.583 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 NetworkManager[54973]: <info>  [1769120774.5860] device (tapd3907878-1a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:26:14 np0005592767 systemd[1]: Started Virtual Machine qemu-29-instance-0000003c.
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.585 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3d86805a-8e82-47c7-a441-813af71e06f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 NetworkManager[54973]: <info>  [1769120774.5885] device (tapd3907878-1a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.599 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9c014599-2796-42f7-9096-7cfd2880d69c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.626 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fb5a23-32f5-42a7-b6e2-27b8104a6aff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.632 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[81634381-3cc1-4df2-8b57-89583fabaacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 NetworkManager[54973]: <info>  [1769120774.6334] manager: (tap3f321120-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.659 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d1ea63-bb74-409f-a697-71aabf3293e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.662 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d04d74ed-13f6-4b10-9ca7-5ce2e57e467a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 NetworkManager[54973]: <info>  [1769120774.6820] device (tap3f321120-f0): carrier: link connected
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.686 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[887670cb-02a5-43c1-adfb-c52854fc2ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.705 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e448899c-1e58-4577-a129-0e378ebb134f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f321120-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:d1:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437126, 'reachable_time': 25581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219373, 'error': None, 'target': 'ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.719 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5f13c937-f5d2-4dbf-a505-056925dea04c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:d194'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 437126, 'tstamp': 437126}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219374, 'error': None, 'target': 'ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.733 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7367f4f5-6a51-43d3-99ba-1906fc40234c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f321120-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:d1:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437126, 'reachable_time': 25581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219377, 'error': None, 'target': 'ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.757 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1f718b-bb14-42cb-a2ec-c5a0d54d627f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.815 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[81bfb2ba-f193-4cbc-9ff1-572984986759]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.816 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f321120-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.817 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.816 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120774.816436, f619e46f-8faf-4be7-bf91-55c442ffb031 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.817 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f321120-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.817 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] VM Started (Lifecycle Event)#033[00m
Jan 22 17:26:14 np0005592767 NetworkManager[54973]: <info>  [1769120774.8207] manager: (tap3f321120-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 22 17:26:14 np0005592767 kernel: tap3f321120-f0: entered promiscuous mode
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.821 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.823 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.823 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f321120-f0, col_values=(('external_ids', {'iface-id': '010c5a58-baaf-41e8-a2d3-45c97a15391f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.825 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:14Z|00209|binding|INFO|Releasing lport 010c5a58-baaf-41e8-a2d3-45c97a15391f from this chassis (sb_readonly=0)
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.840 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.842 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.843 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f321120-f6a6-4fa1-8b1b-f23890fe2465.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f321120-f6a6-4fa1-8b1b-f23890fe2465.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.844 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cabb89-ed0b-4182-9c84-d756bda5caeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.845 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-3f321120-f6a6-4fa1-8b1b-f23890fe2465
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/3f321120-f6a6-4fa1-8b1b-f23890fe2465.pid.haproxy
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 3f321120-f6a6-4fa1-8b1b-f23890fe2465
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:26:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:14.847 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'env', 'PROCESS_TAG=haproxy-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f321120-f6a6-4fa1-8b1b-f23890fe2465.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.855 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.858 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120774.8174963, f619e46f-8faf-4be7-bf91-55c442ffb031 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.859 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.891 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.895 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:26:14 np0005592767 nova_compute[182623]: 2026-01-22 22:26:14.914 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:26:15 np0005592767 podman[219414]: 2026-01-22 22:26:15.183734683 +0000 UTC m=+0.042812971 container create e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:26:15 np0005592767 systemd[1]: Started libpod-conmon-e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50.scope.
Jan 22 17:26:15 np0005592767 podman[219414]: 2026-01-22 22:26:15.161913146 +0000 UTC m=+0.020991464 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:26:15 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:26:15 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58491333ef5c33e83020f82f0d5cd810448b3c56cd06c1de5b8a835442140fe6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:26:15 np0005592767 podman[219414]: 2026-01-22 22:26:15.285483376 +0000 UTC m=+0.144561754 container init e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:26:15 np0005592767 podman[219414]: 2026-01-22 22:26:15.294784188 +0000 UTC m=+0.153862506 container start e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:26:15 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [NOTICE]   (219432) : New worker (219434) forked
Jan 22 17:26:15 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [NOTICE]   (219432) : Loading success.
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.354 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.929 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.930 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.930 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:15 np0005592767 nova_compute[182623]: 2026-01-22 22:26:15.931 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.037 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.111 182627 DEBUG nova.network.neutron [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Updated VIF entry in instance network info cache for port d3907878-1a82-42e9-b0b0-ed0b768e7e68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.112 182627 DEBUG nova.network.neutron [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Updating instance_info_cache with network_info: [{"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.130 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.131 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.157 182627 DEBUG oslo_concurrency.lockutils [req-e7d8de35-b9a3-44b2-b810-c9231fd7179c req-aa34ad1f-090e-4124-b62b-d5979c76c072 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f619e46f-8faf-4be7-bf91-55c442ffb031" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.159 182627 DEBUG nova.compute.manager [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Received event network-vif-plugged-d3907878-1a82-42e9-b0b0-ed0b768e7e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.160 182627 DEBUG oslo_concurrency.lockutils [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.160 182627 DEBUG oslo_concurrency.lockutils [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.160 182627 DEBUG oslo_concurrency.lockutils [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.161 182627 DEBUG nova.compute.manager [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Processing event network-vif-plugged-d3907878-1a82-42e9-b0b0-ed0b768e7e68 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.161 182627 DEBUG nova.compute.manager [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Received event network-vif-plugged-d3907878-1a82-42e9-b0b0-ed0b768e7e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.161 182627 DEBUG oslo_concurrency.lockutils [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.161 182627 DEBUG oslo_concurrency.lockutils [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.162 182627 DEBUG oslo_concurrency.lockutils [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.162 182627 DEBUG nova.compute.manager [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] No waiting events found dispatching network-vif-plugged-d3907878-1a82-42e9-b0b0-ed0b768e7e68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.162 182627 WARNING nova.compute.manager [req-33643f3a-4aa2-4de8-b0f3-87162fc3cfaf req-8321700c-3876-493a-b279-8c7bcf429c10 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Received unexpected event network-vif-plugged-d3907878-1a82-42e9-b0b0-ed0b768e7e68 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.163 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.167 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120776.1667984, f619e46f-8faf-4be7-bf91-55c442ffb031 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.167 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.168 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.172 182627 INFO nova.virt.libvirt.driver [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Instance spawned successfully.#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.172 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.221 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.224 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.227 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.228 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.229 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.229 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.230 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.230 182627 DEBUG nova.virt.libvirt.driver [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.235 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.287 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.347 182627 INFO nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Took 7.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.347 182627 DEBUG nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.426 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.431 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5658MB free_disk=73.23501968383789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.432 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.432 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.455 182627 INFO nova.compute.manager [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Took 8.47 seconds to build instance.#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.506 182627 DEBUG oslo_concurrency.lockutils [None req-ed6bc7ca-4d91-4e12-af4b-385f8e59659d 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.544 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance f619e46f-8faf-4be7-bf91-55c442ffb031 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.544 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.545 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.633 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.721 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.751 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:26:16 np0005592767 nova_compute[182623]: 2026-01-22 22:26:16.751 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:18 np0005592767 nova_compute[182623]: 2026-01-22 22:26:18.686 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:18 np0005592767 nova_compute[182623]: 2026-01-22 22:26:18.870 182627 DEBUG nova.compute.manager [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:18 np0005592767 nova_compute[182623]: 2026-01-22 22:26:18.994 182627 INFO nova.compute.manager [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] instance snapshotting#033[00m
Jan 22 17:26:19 np0005592767 nova_compute[182623]: 2026-01-22 22:26:19.544 182627 INFO nova.virt.libvirt.driver [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Beginning live snapshot process#033[00m
Jan 22 17:26:19 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:26:19 np0005592767 nova_compute[182623]: 2026-01-22 22:26:19.800 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:19 np0005592767 nova_compute[182623]: 2026-01-22 22:26:19.877 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:19 np0005592767 nova_compute[182623]: 2026-01-22 22:26:19.879 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:19 np0005592767 nova_compute[182623]: 2026-01-22 22:26:19.943 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:19 np0005592767 nova_compute[182623]: 2026-01-22 22:26:19.960 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.012 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.013 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmplzzdckhf/d1bd280a1b884f11a644f5b88f84feea.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.062 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmplzzdckhf/d1bd280a1b884f11a644f5b88f84feea.delta 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.063 182627 INFO nova.virt.libvirt.driver [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.115 182627 DEBUG nova.virt.libvirt.guest [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.121 182627 INFO nova.virt.libvirt.driver [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.153 182627 DEBUG nova.privsep.utils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.154 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmplzzdckhf/d1bd280a1b884f11a644f5b88f84feea.delta /var/lib/nova/instances/snapshots/tmplzzdckhf/d1bd280a1b884f11a644f5b88f84feea execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.347 182627 DEBUG oslo_concurrency.processutils [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmplzzdckhf/d1bd280a1b884f11a644f5b88f84feea.delta /var/lib/nova/instances/snapshots/tmplzzdckhf/d1bd280a1b884f11a644f5b88f84feea" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.348 182627 INFO nova.virt.libvirt.driver [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:26:20 np0005592767 nova_compute[182623]: 2026-01-22 22:26:20.357 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:21 np0005592767 podman[219475]: 2026-01-22 22:26:21.152563928 +0000 UTC m=+0.066818018 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:26:23 np0005592767 nova_compute[182623]: 2026-01-22 22:26:23.729 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:23 np0005592767 nova_compute[182623]: 2026-01-22 22:26:23.909 182627 INFO nova.virt.libvirt.driver [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Snapshot image upload complete#033[00m
Jan 22 17:26:23 np0005592767 nova_compute[182623]: 2026-01-22 22:26:23.910 182627 INFO nova.compute.manager [None req-652e50ef-efea-4e79-95ec-8fd6452bdf18 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Took 4.90 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:26:25 np0005592767 nova_compute[182623]: 2026-01-22 22:26:25.361 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:28Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:36:b0 10.100.0.4
Jan 22 17:26:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:28Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:36:b0 10.100.0.4
Jan 22 17:26:28 np0005592767 nova_compute[182623]: 2026-01-22 22:26:28.734 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:28 np0005592767 nova_compute[182623]: 2026-01-22 22:26:28.765 182627 DEBUG nova.compute.manager [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:28 np0005592767 nova_compute[182623]: 2026-01-22 22:26:28.863 182627 INFO nova.compute.manager [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] instance snapshotting#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.245 182627 INFO nova.virt.libvirt.driver [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Beginning live snapshot process#033[00m
Jan 22 17:26:29 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.547 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.616 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.618 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.673 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031/disk --force-share --output=json -f qcow2" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.697 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.751 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.752 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2cgrauk6/b52215863adf4d2f94ef7bd99e064ab4.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.791 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2cgrauk6/b52215863adf4d2f94ef7bd99e064ab4.delta 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.792 182627 INFO nova.virt.libvirt.driver [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:26:29 np0005592767 nova_compute[182623]: 2026-01-22 22:26:29.845 182627 DEBUG nova.virt.libvirt.guest [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.349 182627 DEBUG nova.virt.libvirt.guest [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.353 182627 INFO nova.virt.libvirt.driver [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.362 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.390 182627 DEBUG nova.privsep.utils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.391 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2cgrauk6/b52215863adf4d2f94ef7bd99e064ab4.delta /var/lib/nova/instances/snapshots/tmp2cgrauk6/b52215863adf4d2f94ef7bd99e064ab4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.807 182627 DEBUG oslo_concurrency.processutils [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2cgrauk6/b52215863adf4d2f94ef7bd99e064ab4.delta /var/lib/nova/instances/snapshots/tmp2cgrauk6/b52215863adf4d2f94ef7bd99e064ab4" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:30 np0005592767 nova_compute[182623]: 2026-01-22 22:26:30.814 182627 INFO nova.virt.libvirt.driver [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:26:33 np0005592767 podman[219544]: 2026-01-22 22:26:33.158521577 +0000 UTC m=+0.077296434 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 22 17:26:33 np0005592767 nova_compute[182623]: 2026-01-22 22:26:33.737 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:34 np0005592767 nova_compute[182623]: 2026-01-22 22:26:34.092 182627 INFO nova.virt.libvirt.driver [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Snapshot image upload complete#033[00m
Jan 22 17:26:34 np0005592767 nova_compute[182623]: 2026-01-22 22:26:34.092 182627 INFO nova.compute.manager [None req-10af0ba1-e782-48e2-a158-543269124dee 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Took 5.20 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:26:35 np0005592767 nova_compute[182623]: 2026-01-22 22:26:35.393 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 podman[219565]: 2026-01-22 22:26:38.16796699 +0000 UTC m=+0.071104539 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., 
com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:26:38 np0005592767 podman[219564]: 2026-01-22 22:26:38.214661028 +0000 UTC m=+0.116783688 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.373 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "f619e46f-8faf-4be7-bf91-55c442ffb031" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.374 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.375 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.375 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.376 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.390 182627 INFO nova.compute.manager [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Terminating instance#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.407 182627 DEBUG nova.compute.manager [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:26:38 np0005592767 kernel: tapd3907878-1a (unregistering): left promiscuous mode
Jan 22 17:26:38 np0005592767 NetworkManager[54973]: <info>  [1769120798.4318] device (tapd3907878-1a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:26:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:38Z|00210|binding|INFO|Releasing lport d3907878-1a82-42e9-b0b0-ed0b768e7e68 from this chassis (sb_readonly=0)
Jan 22 17:26:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:38Z|00211|binding|INFO|Setting lport d3907878-1a82-42e9-b0b0-ed0b768e7e68 down in Southbound
Jan 22 17:26:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:38Z|00212|binding|INFO|Removing iface tapd3907878-1a ovn-installed in OVS
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.445 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.452 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.458 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:36:b0 10.100.0.4'], port_security=['fa:16:3e:9f:36:b0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f619e46f-8faf-4be7-bf91-55c442ffb031', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4e421ef5ced94104b2eb81cb88740956', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b787468c-31bd-4ea3-a82f-2733be2a5b09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28adf37a-ced0-49be-897a-97e8d81fa9a8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=d3907878-1a82-42e9-b0b0-ed0b768e7e68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.460 104135 INFO neutron.agent.ovn.metadata.agent [-] Port d3907878-1a82-42e9-b0b0-ed0b768e7e68 in datapath 3f321120-f6a6-4fa1-8b1b-f23890fe2465 unbound from our chassis#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.462 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f321120-f6a6-4fa1-8b1b-f23890fe2465, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.464 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[273a62f9-7c32-4d87-86a3-d3d4596d94dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.465 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465 namespace which is not needed anymore#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.475 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Jan 22 17:26:38 np0005592767 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003c.scope: Consumed 12.276s CPU time.
Jan 22 17:26:38 np0005592767 systemd-machined[153912]: Machine qemu-29-instance-0000003c terminated.
Jan 22 17:26:38 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [NOTICE]   (219432) : haproxy version is 2.8.14-c23fe91
Jan 22 17:26:38 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [NOTICE]   (219432) : path to executable is /usr/sbin/haproxy
Jan 22 17:26:38 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [WARNING]  (219432) : Exiting Master process...
Jan 22 17:26:38 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [WARNING]  (219432) : Exiting Master process...
Jan 22 17:26:38 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [ALERT]    (219432) : Current worker (219434) exited with code 143 (Terminated)
Jan 22 17:26:38 np0005592767 neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465[219428]: [WARNING]  (219432) : All workers exited. Exiting... (0)
Jan 22 17:26:38 np0005592767 systemd[1]: libpod-e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50.scope: Deactivated successfully.
Jan 22 17:26:38 np0005592767 podman[219634]: 2026-01-22 22:26:38.622215557 +0000 UTC m=+0.046211436 container died e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:26:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50-userdata-shm.mount: Deactivated successfully.
Jan 22 17:26:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay-58491333ef5c33e83020f82f0d5cd810448b3c56cd06c1de5b8a835442140fe6-merged.mount: Deactivated successfully.
Jan 22 17:26:38 np0005592767 podman[219634]: 2026-01-22 22:26:38.665136619 +0000 UTC m=+0.089132498 container cleanup e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.668 182627 INFO nova.virt.libvirt.driver [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Instance destroyed successfully.#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.669 182627 DEBUG nova.objects.instance [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lazy-loading 'resources' on Instance uuid f619e46f-8faf-4be7-bf91-55c442ffb031 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:26:38 np0005592767 systemd[1]: libpod-conmon-e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50.scope: Deactivated successfully.
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.688 182627 DEBUG nova.virt.libvirt.vif [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-119284419',display_name='tempest-ImagesOneServerTestJSON-server-119284419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-119284419',id=60,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:26:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4e421ef5ced94104b2eb81cb88740956',ramdisk_id='',reservation_id='r-dw6w16jy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-411724568',owner_user_name='tempest-ImagesOneServerTestJSON-411724568-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:26:34Z,user_data=None,user_id='2e521f56216f40fd986489aada143152',uuid=f619e46f-8faf-4be7-bf91-55c442ffb031,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.689 182627 DEBUG nova.network.os_vif_util [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Converting VIF {"id": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "address": "fa:16:3e:9f:36:b0", "network": {"id": "3f321120-f6a6-4fa1-8b1b-f23890fe2465", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-2052848681-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e421ef5ced94104b2eb81cb88740956", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3907878-1a", "ovs_interfaceid": "d3907878-1a82-42e9-b0b0-ed0b768e7e68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.692 182627 DEBUG nova.network.os_vif_util [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.693 182627 DEBUG os_vif [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.698 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.698 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3907878-1a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.699 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.701 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.703 182627 INFO os_vif [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:36:b0,bridge_name='br-int',has_traffic_filtering=True,id=d3907878-1a82-42e9-b0b0-ed0b768e7e68,network=Network(3f321120-f6a6-4fa1-8b1b-f23890fe2465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3907878-1a')#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.703 182627 INFO nova.virt.libvirt.driver [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Deleting instance files /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031_del#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.704 182627 INFO nova.virt.libvirt.driver [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Deletion of /var/lib/nova/instances/f619e46f-8faf-4be7-bf91-55c442ffb031_del complete#033[00m
Jan 22 17:26:38 np0005592767 podman[219680]: 2026-01-22 22:26:38.73138874 +0000 UTC m=+0.043505680 container remove e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.738 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b266af-9932-4d2d-b115-5dd394a53808]: (4, ('Thu Jan 22 10:26:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465 (e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50)\ne57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50\nThu Jan 22 10:26:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465 (e57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50)\ne57075bba745db6567639b4ad9fce66c66316eea0b5bf0a658b55a2426656f50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.740 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dd45b2fd-a1f0-4723-88a8-ea8cb2a5f117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.741 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f321120-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.743 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 kernel: tap3f321120-f0: left promiscuous mode
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.755 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.759 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[81b46ff8-01ba-4e95-ac29-8651dcdb4f29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.774 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2226fc-cf07-468a-83aa-993ab9da89a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.775 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b34858-c566-4dd2-bf7a-63ef4e8e8b80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.791 182627 INFO nova.compute.manager [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.791 182627 DEBUG oslo.service.loopingcall [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.792 182627 DEBUG nova.compute.manager [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:26:38 np0005592767 nova_compute[182623]: 2026-01-22 22:26:38.792 182627 DEBUG nova.network.neutron [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.793 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[11c477e0-6b7f-4d4d-bcf6-a3a6ae60d05b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437120, 'reachable_time': 37393, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219692, 'error': None, 'target': 'ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:38 np0005592767 systemd[1]: run-netns-ovnmeta\x2d3f321120\x2df6a6\x2d4fa1\x2d8b1b\x2df23890fe2465.mount: Deactivated successfully.
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.799 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f321120-f6a6-4fa1-8b1b-f23890fe2465 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:26:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:38.800 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f21fe99f-0f35-404e-bdd2-51c1ebcbc053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.185 182627 DEBUG nova.network.neutron [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.216 182627 INFO nova.compute.manager [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Took 1.42 seconds to deallocate network for instance.#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.302 182627 DEBUG nova.compute.manager [req-f867e667-17c5-4655-b407-98576aa5079d req-550d9d45-e22b-4caa-92a6-4f73e2d311ea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Received event network-vif-deleted-d3907878-1a82-42e9-b0b0-ed0b768e7e68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.323 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.323 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.395 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.432 182627 DEBUG nova.compute.provider_tree [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.585 182627 DEBUG nova.scheduler.client.report [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.602 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.649 182627 INFO nova.scheduler.client.report [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Deleted allocations for instance f619e46f-8faf-4be7-bf91-55c442ffb031#033[00m
Jan 22 17:26:40 np0005592767 nova_compute[182623]: 2026-01-22 22:26:40.724 182627 DEBUG oslo_concurrency.lockutils [None req-9ee6f533-ce46-4c66-8e6a-9ca6289b8535 2e521f56216f40fd986489aada143152 4e421ef5ced94104b2eb81cb88740956 - - default default] Lock "f619e46f-8faf-4be7-bf91-55c442ffb031" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.507 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.508 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.530 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.652 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.653 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.661 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.662 182627 INFO nova.compute.claims [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.702 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.823 182627 DEBUG nova.compute.provider_tree [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.854 182627 DEBUG nova.scheduler.client.report [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.885 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.886 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.958 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.959 182627 DEBUG nova.network.neutron [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:26:43 np0005592767 nova_compute[182623]: 2026-01-22 22:26:43.979 182627 INFO nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.022 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:26:44 np0005592767 podman[219693]: 2026-01-22 22:26:44.129095108 +0000 UTC m=+0.049077957 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:26:44 np0005592767 podman[219694]: 2026-01-22 22:26:44.157225562 +0000 UTC m=+0.064936825 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.160 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.161 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.162 182627 INFO nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Creating image(s)
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.162 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "/var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.162 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "/var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.163 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "/var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.177 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.248 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.250 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.250 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.263 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.303 182627 DEBUG nova.policy [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e7ddb71f6cbf4fc3bfbaf99b01271ec0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '059e811e196b4d02b1144af991a7abeb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.316 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.317 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.362 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.364 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.365 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.417 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.420 182627 DEBUG nova.virt.disk.api [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Checking if we can resize image /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.420 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.489 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.490 182627 DEBUG nova.virt.disk.api [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Cannot resize image /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.491 182627 DEBUG nova.objects.instance [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'migration_context' on Instance uuid 12c7660a-27b8-417e-be1f-cccf937421a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.519 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.519 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Ensure instance console log exists: /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.520 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.520 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:26:44 np0005592767 nova_compute[182623]: 2026-01-22 22:26:44.521 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:26:45 np0005592767 nova_compute[182623]: 2026-01-22 22:26:45.371 182627 DEBUG nova.network.neutron [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Successfully created port: 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:26:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:45.379 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:26:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:45.380 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:26:45 np0005592767 nova_compute[182623]: 2026-01-22 22:26:45.380 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:45.381 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:26:45 np0005592767 nova_compute[182623]: 2026-01-22 22:26:45.397 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:45 np0005592767 nova_compute[182623]: 2026-01-22 22:26:45.652 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:47 np0005592767 nova_compute[182623]: 2026-01-22 22:26:47.908 182627 DEBUG nova.network.neutron [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Successfully updated port: 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:26:47 np0005592767 nova_compute[182623]: 2026-01-22 22:26:47.926 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:26:47 np0005592767 nova_compute[182623]: 2026-01-22 22:26:47.926 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquired lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:26:47 np0005592767 nova_compute[182623]: 2026-01-22 22:26:47.926 182627 DEBUG nova.network.neutron [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:26:48 np0005592767 nova_compute[182623]: 2026-01-22 22:26:48.576 182627 DEBUG nova.network.neutron [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:26:48 np0005592767 nova_compute[182623]: 2026-01-22 22:26:48.705 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:26:48 np0005592767 nova_compute[182623]: 2026-01-22 22:26:48.968 182627 DEBUG nova.compute.manager [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-changed-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:26:48 np0005592767 nova_compute[182623]: 2026-01-22 22:26:48.968 182627 DEBUG nova.compute.manager [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Refreshing instance network info cache due to event network-changed-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:26:48 np0005592767 nova_compute[182623]: 2026-01-22 22:26:48.969 182627 DEBUG oslo_concurrency.lockutils [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.766 182627 DEBUG nova.network.neutron [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updating instance_info_cache with network_info: [{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.808 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Releasing lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.809 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Instance network_info: |[{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.809 182627 DEBUG oslo_concurrency.lockutils [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.810 182627 DEBUG nova.network.neutron [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Refreshing network info cache for port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.816 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Start _get_guest_xml network_info=[{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.822 182627 WARNING nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.834 182627 DEBUG nova.virt.libvirt.host [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.835 182627 DEBUG nova.virt.libvirt.host [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.848 182627 DEBUG nova.virt.libvirt.host [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.849 182627 DEBUG nova.virt.libvirt.host [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.851 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.852 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.853 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.853 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.854 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.854 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.855 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.855 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.856 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.856 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.856 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.857 182627 DEBUG nova.virt.hardware [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.863 182627 DEBUG nova.virt.libvirt.vif [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2019090657',display_name='tempest-SecurityGroupsTestJSON-server-2019090657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2019090657',id=62,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-4fd99k3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:44Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=12c7660a-27b8-417e-be1f-cccf937421a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.864 182627 DEBUG nova.network.os_vif_util [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.865 182627 DEBUG nova.network.os_vif_util [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.866 182627 DEBUG nova.objects.instance [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'pci_devices' on Instance uuid 12c7660a-27b8-417e-be1f-cccf937421a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.909 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <uuid>12c7660a-27b8-417e-be1f-cccf937421a2</uuid>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <name>instance-0000003e</name>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:name>tempest-SecurityGroupsTestJSON-server-2019090657</nova:name>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:26:49</nova:creationTime>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:user uuid="e7ddb71f6cbf4fc3bfbaf99b01271ec0">tempest-SecurityGroupsTestJSON-999807366-project-member</nova:user>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:project uuid="059e811e196b4d02b1144af991a7abeb">tempest-SecurityGroupsTestJSON-999807366</nova:project>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        <nova:port uuid="5524effc-bcff-46ea-90a6-d4ec2eb3b8ae">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <entry name="serial">12c7660a-27b8-417e-be1f-cccf937421a2</entry>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <entry name="uuid">12c7660a-27b8-417e-be1f-cccf937421a2</entry>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.config"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:25:ac:f4"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <target dev="tap5524effc-bc"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/console.log" append="off"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:26:49 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:26:49 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:26:49 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:26:49 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.911 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Preparing to wait for external event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.912 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.912 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.913 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.914 182627 DEBUG nova.virt.libvirt.vif [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2019090657',display_name='tempest-SecurityGroupsTestJSON-server-2019090657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2019090657',id=62,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-4fd99k3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:26:44Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=12c7660a-27b8-417e-be1f-cccf937421a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.915 182627 DEBUG nova.network.os_vif_util [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.916 182627 DEBUG nova.network.os_vif_util [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.916 182627 DEBUG os_vif [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.917 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.918 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.919 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.923 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5524effc-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.924 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5524effc-bc, col_values=(('external_ids', {'iface-id': '5524effc-bcff-46ea-90a6-d4ec2eb3b8ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:ac:f4', 'vm-uuid': '12c7660a-27b8-417e-be1f-cccf937421a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:49 np0005592767 NetworkManager[54973]: <info>  [1769120809.9289] manager: (tap5524effc-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.930 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.935 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:49 np0005592767 nova_compute[182623]: 2026-01-22 22:26:49.936 182627 INFO os_vif [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc')#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.025 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.026 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.027 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] No VIF found with MAC fa:16:3e:25:ac:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.028 182627 INFO nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Using config drive#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.399 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.798 182627 INFO nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Creating config drive at /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.config#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.803 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9rsikr84 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:26:50 np0005592767 nova_compute[182623]: 2026-01-22 22:26:50.940 182627 DEBUG oslo_concurrency.processutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9rsikr84" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:26:51 np0005592767 kernel: tap5524effc-bc: entered promiscuous mode
Jan 22 17:26:51 np0005592767 NetworkManager[54973]: <info>  [1769120811.0173] manager: (tap5524effc-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Jan 22 17:26:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:51Z|00213|binding|INFO|Claiming lport 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae for this chassis.
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.017 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:51Z|00214|binding|INFO|5524effc-bcff-46ea-90a6-d4ec2eb3b8ae: Claiming fa:16:3e:25:ac:f4 10.100.0.12
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.038 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:ac:f4 10.100.0.12'], port_security=['fa:16:3e:25:ac:f4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '12c7660a-27b8-417e-be1f-cccf937421a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f942dd2-c635-41b4-933f-433a748048f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059e811e196b4d02b1144af991a7abeb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0cb8815a-a8d6-4a69-980c-69d9e19d6a4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db28277b-1f2f-44a5-8c51-d711c89d9d3d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.040 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae in datapath 8f942dd2-c635-41b4-933f-433a748048f1 bound to our chassis#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.045 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f942dd2-c635-41b4-933f-433a748048f1#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.063 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d097ac7e-b4c7-45a1-8360-0b737299eb71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.065 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f942dd2-c1 in ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:26:51 np0005592767 systemd-machined[153912]: New machine qemu-30-instance-0000003e.
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.067 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f942dd2-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.067 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[912495be-2905-4409-bf47-354535863fcb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.069 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[915c21db-e037-412e-afef-def20760c919]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.074 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:51Z|00215|binding|INFO|Setting lport 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae ovn-installed in OVS
Jan 22 17:26:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:51Z|00216|binding|INFO|Setting lport 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae up in Southbound
Jan 22 17:26:51 np0005592767 systemd[1]: Started Virtual Machine qemu-30-instance-0000003e.
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.082 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.086 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a2f53e-4e2e-4e1f-a69a-fb27bfb49c08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 systemd-udevd[219774]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:26:51 np0005592767 NetworkManager[54973]: <info>  [1769120811.1055] device (tap5524effc-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:26:51 np0005592767 NetworkManager[54973]: <info>  [1769120811.1069] device (tap5524effc-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.105 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a811845-7d1a-431f-9441-aa8431fe53c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.141 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fa783271-895f-413e-b2fa-f56d14e90fac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 NetworkManager[54973]: <info>  [1769120811.1496] manager: (tap8f942dd2-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.148 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bc38979b-61c6-4d2e-9975-2d5113891899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.187 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5f3499-eac7-4709-8626-20b23c21ea94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.191 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0f348966-9fcc-41c8-9ec7-a2c6615c2fd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 NetworkManager[54973]: <info>  [1769120811.2114] device (tap8f942dd2-c0): carrier: link connected
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.215 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[772cc365-886e-4a19-b9e5-6ed00f732174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.232 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2c8d7d-5658-4fb8-b50b-a81915f0deed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f942dd2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:b4:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440778, 'reachable_time': 20004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219815, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.245 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5a2b8e-5bc8-4e70-9e1b-97f30f3e625c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:b44e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440778, 'tstamp': 440778}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219823, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.262 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[34afd929-971a-4a48-aa27-84d169299cf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f942dd2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:b4:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440778, 'reachable_time': 20004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219831, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 podman[219794]: 2026-01-22 22:26:51.278718898 +0000 UTC m=+0.085026482 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.294 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[06574091-5185-4cff-9833-bdf235b94f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.345 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[05ec08ef-8953-46a4-a06b-d2c613e888af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.347 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f942dd2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.347 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.347 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f942dd2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:51 np0005592767 NetworkManager[54973]: <info>  [1769120811.3498] manager: (tap8f942dd2-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.349 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 kernel: tap8f942dd2-c0: entered promiscuous mode
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.352 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.353 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f942dd2-c0, col_values=(('external_ids', {'iface-id': '2fc42b0b-3731-4102-bca2-f1c3c740a7e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.354 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:26:51Z|00217|binding|INFO|Releasing lport 2fc42b0b-3731-4102-bca2-f1c3c740a7e6 from this chassis (sb_readonly=0)
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.364 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.365 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f942dd2-c635-41b4-933f-433a748048f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f942dd2-c635-41b4-933f-433a748048f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.366 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cbc4a5-5fb5-4033-a799-cca8482d2e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.367 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-8f942dd2-c635-41b4-933f-433a748048f1
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/8f942dd2-c635-41b4-933f-433a748048f1.pid.haproxy
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 8f942dd2-c635-41b4-933f-433a748048f1
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:26:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:26:51.368 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'env', 'PROCESS_TAG=haproxy-8f942dd2-c635-41b4-933f-433a748048f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f942dd2-c635-41b4-933f-433a748048f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.389 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120811.3889365, 12c7660a-27b8-417e-be1f-cccf937421a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.390 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] VM Started (Lifecycle Event)#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.410 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.414 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120811.3901262, 12c7660a-27b8-417e-be1f-cccf937421a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.415 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.428 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.431 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:26:51 np0005592767 nova_compute[182623]: 2026-01-22 22:26:51.462 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:26:51 np0005592767 podman[219870]: 2026-01-22 22:26:51.746528069 +0000 UTC m=+0.053721768 container create c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:26:51 np0005592767 systemd[1]: Started libpod-conmon-c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c.scope.
Jan 22 17:26:51 np0005592767 podman[219870]: 2026-01-22 22:26:51.718662472 +0000 UTC m=+0.025856221 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:26:51 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:26:51 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/653e9fe981ef345cdac7d18ec270f254ffe69aaf91e09920ea2b536a2ec2ca28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:26:51 np0005592767 podman[219870]: 2026-01-22 22:26:51.858905622 +0000 UTC m=+0.166099341 container init c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:26:51 np0005592767 podman[219870]: 2026-01-22 22:26:51.863959875 +0000 UTC m=+0.171153574 container start c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:26:51 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [NOTICE]   (219889) : New worker (219891) forked
Jan 22 17:26:51 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [NOTICE]   (219889) : Loading success.
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.613 182627 DEBUG nova.compute.manager [req-fa209ed1-0fbd-4676-af02-192987a3010c req-ffc72062-50b9-4393-8eaf-676cfef3a944 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.614 182627 DEBUG oslo_concurrency.lockutils [req-fa209ed1-0fbd-4676-af02-192987a3010c req-ffc72062-50b9-4393-8eaf-676cfef3a944 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.615 182627 DEBUG oslo_concurrency.lockutils [req-fa209ed1-0fbd-4676-af02-192987a3010c req-ffc72062-50b9-4393-8eaf-676cfef3a944 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.616 182627 DEBUG oslo_concurrency.lockutils [req-fa209ed1-0fbd-4676-af02-192987a3010c req-ffc72062-50b9-4393-8eaf-676cfef3a944 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.616 182627 DEBUG nova.compute.manager [req-fa209ed1-0fbd-4676-af02-192987a3010c req-ffc72062-50b9-4393-8eaf-676cfef3a944 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Processing event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.618 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.625 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120812.6247935, 12c7660a-27b8-417e-be1f-cccf937421a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.625 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.627 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.631 182627 INFO nova.virt.libvirt.driver [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Instance spawned successfully.#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.632 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.649 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.656 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.659 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.659 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.660 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.660 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.661 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.661 182627 DEBUG nova.virt.libvirt.driver [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.694 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.728 182627 DEBUG nova.network.neutron [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updated VIF entry in instance network info cache for port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.729 182627 DEBUG nova.network.neutron [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updating instance_info_cache with network_info: [{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.757 182627 DEBUG oslo_concurrency.lockutils [req-72fa2824-fc28-4f87-bca0-bd8ccda9a3b1 req-618222c2-027f-4036-b1a3-3da623aca820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.767 182627 INFO nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Took 8.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.767 182627 DEBUG nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.888 182627 INFO nova.compute.manager [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Took 9.29 seconds to build instance.#033[00m
Jan 22 17:26:52 np0005592767 nova_compute[182623]: 2026-01-22 22:26:52.918 182627 DEBUG oslo_concurrency.lockutils [None req-15efc3c5-4178-4e72-9ff8-863a23a8ccd2 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:53 np0005592767 nova_compute[182623]: 2026-01-22 22:26:53.667 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120798.6657834, f619e46f-8faf-4be7-bf91-55c442ffb031 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:26:53 np0005592767 nova_compute[182623]: 2026-01-22 22:26:53.668 182627 INFO nova.compute.manager [-] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:26:53 np0005592767 nova_compute[182623]: 2026-01-22 22:26:53.693 182627 DEBUG nova.compute.manager [None req-174c85dd-2a4a-41e7-b1f2-6daf11966a53 - - - - - -] [instance: f619e46f-8faf-4be7-bf91-55c442ffb031] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.754 182627 DEBUG nova.compute.manager [req-f9ca8962-02f1-48f1-a7ba-8602d616c2d3 req-43ceb65a-8b37-4443-8716-618f1700861d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.755 182627 DEBUG oslo_concurrency.lockutils [req-f9ca8962-02f1-48f1-a7ba-8602d616c2d3 req-43ceb65a-8b37-4443-8716-618f1700861d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.755 182627 DEBUG oslo_concurrency.lockutils [req-f9ca8962-02f1-48f1-a7ba-8602d616c2d3 req-43ceb65a-8b37-4443-8716-618f1700861d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.756 182627 DEBUG oslo_concurrency.lockutils [req-f9ca8962-02f1-48f1-a7ba-8602d616c2d3 req-43ceb65a-8b37-4443-8716-618f1700861d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.756 182627 DEBUG nova.compute.manager [req-f9ca8962-02f1-48f1-a7ba-8602d616c2d3 req-43ceb65a-8b37-4443-8716-618f1700861d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] No waiting events found dispatching network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.757 182627 WARNING nova.compute.manager [req-f9ca8962-02f1-48f1-a7ba-8602d616c2d3 req-43ceb65a-8b37-4443-8716-618f1700861d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received unexpected event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae for instance with vm_state active and task_state None.#033[00m
Jan 22 17:26:54 np0005592767 nova_compute[182623]: 2026-01-22 22:26:54.928 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:55 np0005592767 nova_compute[182623]: 2026-01-22 22:26:55.411 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:26:56 np0005592767 nova_compute[182623]: 2026-01-22 22:26:56.299 182627 DEBUG nova.compute.manager [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-changed-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:56 np0005592767 nova_compute[182623]: 2026-01-22 22:26:56.300 182627 DEBUG nova.compute.manager [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Refreshing instance network info cache due to event network-changed-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:26:56 np0005592767 nova_compute[182623]: 2026-01-22 22:26:56.300 182627 DEBUG oslo_concurrency.lockutils [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:26:56 np0005592767 nova_compute[182623]: 2026-01-22 22:26:56.301 182627 DEBUG oslo_concurrency.lockutils [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:26:56 np0005592767 nova_compute[182623]: 2026-01-22 22:26:56.301 182627 DEBUG nova.network.neutron [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Refreshing network info cache for port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:26:57 np0005592767 nova_compute[182623]: 2026-01-22 22:26:57.612 182627 DEBUG nova.compute.manager [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-changed-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:26:57 np0005592767 nova_compute[182623]: 2026-01-22 22:26:57.612 182627 DEBUG nova.compute.manager [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Refreshing instance network info cache due to event network-changed-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:26:57 np0005592767 nova_compute[182623]: 2026-01-22 22:26:57.612 182627 DEBUG oslo_concurrency.lockutils [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:26:58 np0005592767 nova_compute[182623]: 2026-01-22 22:26:58.881 182627 DEBUG nova.network.neutron [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updated VIF entry in instance network info cache for port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:26:58 np0005592767 nova_compute[182623]: 2026-01-22 22:26:58.882 182627 DEBUG nova.network.neutron [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updating instance_info_cache with network_info: [{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:26:58 np0005592767 nova_compute[182623]: 2026-01-22 22:26:58.914 182627 DEBUG oslo_concurrency.lockutils [req-e5190c30-dfbd-4649-84df-f3d1c6a3d5dc req-d2fa6924-469a-4acc-8e3f-9f9617d0d1de 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:26:58 np0005592767 nova_compute[182623]: 2026-01-22 22:26:58.916 182627 DEBUG oslo_concurrency.lockutils [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:26:58 np0005592767 nova_compute[182623]: 2026-01-22 22:26:58.916 182627 DEBUG nova.network.neutron [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Refreshing network info cache for port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:26:59 np0005592767 nova_compute[182623]: 2026-01-22 22:26:59.931 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:00 np0005592767 nova_compute[182623]: 2026-01-22 22:27:00.412 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:01 np0005592767 nova_compute[182623]: 2026-01-22 22:27:01.389 182627 DEBUG nova.network.neutron [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updated VIF entry in instance network info cache for port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:27:01 np0005592767 nova_compute[182623]: 2026-01-22 22:27:01.390 182627 DEBUG nova.network.neutron [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updating instance_info_cache with network_info: [{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:01 np0005592767 nova_compute[182623]: 2026-01-22 22:27:01.407 182627 DEBUG oslo_concurrency.lockutils [req-a79688e6-d42d-47e8-823d-a58862bfec09 req-f25b3601-0d17-46c5-a4e2-2147e57e2be1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:04 np0005592767 podman[219916]: 2026-01-22 22:27:04.134097033 +0000 UTC m=+0.055753946 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:27:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:04Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:25:ac:f4 10.100.0.12
Jan 22 17:27:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:04Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:25:ac:f4 10.100.0.12
Jan 22 17:27:04 np0005592767 nova_compute[182623]: 2026-01-22 22:27:04.933 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:05 np0005592767 nova_compute[182623]: 2026-01-22 22:27:05.414 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:09 np0005592767 podman[219937]: 2026-01-22 22:27:09.152861209 +0000 UTC m=+0.062326731 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:27:09 np0005592767 podman[219936]: 2026-01-22 22:27:09.255499778 +0000 UTC m=+0.168830509 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 17:27:10 np0005592767 nova_compute[182623]: 2026-01-22 22:27:10.105 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:10 np0005592767 nova_compute[182623]: 2026-01-22 22:27:10.450 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:12.098 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:12.099 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:12.100 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.199 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.200 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.220 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.350 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.350 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.361 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.362 182627 INFO nova.compute.claims [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.480 182627 DEBUG nova.compute.provider_tree [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.503 182627 DEBUG nova.scheduler.client.report [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.526 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.527 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.578 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.579 182627 DEBUG nova.network.neutron [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.594 182627 INFO nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.609 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.712 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.714 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.715 182627 INFO nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Creating image(s)#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.716 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.717 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.719 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.751 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.819 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.820 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.821 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.832 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.884 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.886 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.934 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.935 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.936 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.991 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.992 182627 DEBUG nova.virt.disk.api [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Checking if we can resize image /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:27:12 np0005592767 nova_compute[182623]: 2026-01-22 22:27:12.992 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.045 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.046 182627 DEBUG nova.virt.disk.api [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Cannot resize image /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.046 182627 DEBUG nova.objects.instance [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'migration_context' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.060 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.061 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Ensure instance console log exists: /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.062 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.062 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.063 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:13 np0005592767 nova_compute[182623]: 2026-01-22 22:27:13.230 182627 DEBUG nova.policy [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e7ddb71f6cbf4fc3bfbaf99b01271ec0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '059e811e196b4d02b1144af991a7abeb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.329 182627 DEBUG nova.network.neutron [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Successfully created port: ad68ff09-e8bf-45c3-acaa-6ea162f80993 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.746 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.747 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.748 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.748 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.748 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.748 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:27:14 np0005592767 nova_compute[182623]: 2026-01-22 22:27:14.948 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:15 np0005592767 podman[220001]: 2026-01-22 22:27:15.159970305 +0000 UTC m=+0.060618233 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:27:15 np0005592767 podman[220000]: 2026-01-22 22:27:15.16935136 +0000 UTC m=+0.069698950 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.198 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.198 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.198 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.198 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 12c7660a-27b8-417e-be1f-cccf937421a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.266 182627 DEBUG nova.network.neutron [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Successfully updated port: ad68ff09-e8bf-45c3-acaa-6ea162f80993 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.310 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.310 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquired lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.310 182627 DEBUG nova.network.neutron [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.409 182627 DEBUG nova.compute.manager [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-changed-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.410 182627 DEBUG nova.compute.manager [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Refreshing instance network info cache due to event network-changed-ad68ff09-e8bf-45c3-acaa-6ea162f80993. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.410 182627 DEBUG oslo_concurrency.lockutils [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.452 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:15 np0005592767 nova_compute[182623]: 2026-01-22 22:27:15.539 182627 DEBUG nova.network.neutron [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.943 182627 DEBUG nova.network.neutron [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updating instance_info_cache with network_info: [{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.960 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Releasing lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.960 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance network_info: |[{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.961 182627 DEBUG oslo_concurrency.lockutils [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.961 182627 DEBUG nova.network.neutron [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Refreshing network info cache for port ad68ff09-e8bf-45c3-acaa-6ea162f80993 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.963 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Start _get_guest_xml network_info=[{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.967 182627 WARNING nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.981 182627 DEBUG nova.virt.libvirt.host [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.982 182627 DEBUG nova.virt.libvirt.host [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.985 182627 DEBUG nova.virt.libvirt.host [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.985 182627 DEBUG nova.virt.libvirt.host [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.986 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.986 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.987 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.987 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.987 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.987 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.987 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.987 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.988 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.988 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.988 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.988 182627 DEBUG nova.virt.hardware [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.992 182627 DEBUG nova.virt.libvirt.vif [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:27:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-139043215',display_name='tempest-SecurityGroupsTestJSON-server-139043215',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-139043215',id=65,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-x0v0jtub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJ
SON-999807366-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:12Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=84c43bd6-304f-488c-8cc1-7d740b132d6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.992 182627 DEBUG nova.network.os_vif_util [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.993 182627 DEBUG nova.network.os_vif_util [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:16 np0005592767 nova_compute[182623]: 2026-01-22 22:27:16.994 182627 DEBUG nova.objects.instance [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'pci_devices' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.015 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <uuid>84c43bd6-304f-488c-8cc1-7d740b132d6d</uuid>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <name>instance-00000041</name>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:name>tempest-SecurityGroupsTestJSON-server-139043215</nova:name>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:27:16</nova:creationTime>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:user uuid="e7ddb71f6cbf4fc3bfbaf99b01271ec0">tempest-SecurityGroupsTestJSON-999807366-project-member</nova:user>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:project uuid="059e811e196b4d02b1144af991a7abeb">tempest-SecurityGroupsTestJSON-999807366</nova:project>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        <nova:port uuid="ad68ff09-e8bf-45c3-acaa-6ea162f80993">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <entry name="serial">84c43bd6-304f-488c-8cc1-7d740b132d6d</entry>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <entry name="uuid">84c43bd6-304f-488c-8cc1-7d740b132d6d</entry>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:11:7a:24"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <target dev="tapad68ff09-e8"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/console.log" append="off"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:27:17 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:27:17 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:27:17 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:27:17 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.016 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Preparing to wait for external event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.016 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.016 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.017 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.018 182627 DEBUG nova.virt.libvirt.vif [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:27:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-139043215',display_name='tempest-SecurityGroupsTestJSON-server-139043215',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-139043215',id=65,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-x0v0jtub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:12Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=84c43bd6-304f-488c-8cc1-7d740b132d6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.018 182627 DEBUG nova.network.os_vif_util [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.019 182627 DEBUG nova.network.os_vif_util [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.020 182627 DEBUG os_vif [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.021 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.022 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.023 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.026 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.027 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad68ff09-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.027 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad68ff09-e8, col_values=(('external_ids', {'iface-id': 'ad68ff09-e8bf-45c3-acaa-6ea162f80993', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:7a:24', 'vm-uuid': '84c43bd6-304f-488c-8cc1-7d740b132d6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.028 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 NetworkManager[54973]: <info>  [1769120837.0304] manager: (tapad68ff09-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.031 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.041 182627 INFO os_vif [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8')#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.100 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.101 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.101 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] No VIF found with MAC fa:16:3e:11:7a:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.101 182627 INFO nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Using config drive#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.263 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updating instance_info_cache with network_info: [{"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.280 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-12c7660a-27b8-417e-be1f-cccf937421a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.280 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.281 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.282 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.304 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.304 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.305 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.305 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.367 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.433 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.434 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.506 182627 INFO nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Creating config drive at /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.511 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5bf1ftdf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.528 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.534 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.592 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.594 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.636 182627 DEBUG oslo_concurrency.processutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5bf1ftdf" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.657 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:17 np0005592767 kernel: tapad68ff09-e8: entered promiscuous mode
Jan 22 17:27:17 np0005592767 NetworkManager[54973]: <info>  [1769120837.6930] manager: (tapad68ff09-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.691 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:17Z|00218|binding|INFO|Claiming lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 for this chassis.
Jan 22 17:27:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:17Z|00219|binding|INFO|ad68ff09-e8bf-45c3-acaa-6ea162f80993: Claiming fa:16:3e:11:7a:24 10.100.0.13
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.700 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:7a:24 10.100.0.13'], port_security=['fa:16:3e:11:7a:24 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '84c43bd6-304f-488c-8cc1-7d740b132d6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f942dd2-c635-41b4-933f-433a748048f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059e811e196b4d02b1144af991a7abeb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0cb8815a-a8d6-4a69-980c-69d9e19d6a4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db28277b-1f2f-44a5-8c51-d711c89d9d3d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ad68ff09-e8bf-45c3-acaa-6ea162f80993) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.702 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ad68ff09-e8bf-45c3-acaa-6ea162f80993 in datapath 8f942dd2-c635-41b4-933f-433a748048f1 bound to our chassis#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.705 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f942dd2-c635-41b4-933f-433a748048f1#033[00m
Jan 22 17:27:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:17Z|00220|binding|INFO|Setting lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 ovn-installed in OVS
Jan 22 17:27:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:17Z|00221|binding|INFO|Setting lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 up in Southbound
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.708 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.711 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.725 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[702dcf51-62d7-4d14-a122-3b1e1e4045b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:17 np0005592767 systemd-udevd[220073]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:27:17 np0005592767 systemd-machined[153912]: New machine qemu-31-instance-00000041.
Jan 22 17:27:17 np0005592767 NetworkManager[54973]: <info>  [1769120837.7402] device (tapad68ff09-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:27:17 np0005592767 NetworkManager[54973]: <info>  [1769120837.7408] device (tapad68ff09-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:27:17 np0005592767 systemd[1]: Started Virtual Machine qemu-31-instance-00000041.
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.752 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[06a7f597-e856-4cb8-b357-5c895c80f9ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.755 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd7cfe4-f7ce-4441-92e3-1514ccdd9875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.781 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7342af-0516-4859-91a2-125bd4acded9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.798 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2e06ff72-0214-46f4-8ba6-a5082b450b99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f942dd2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:b4:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440778, 'reachable_time': 29701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220086, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.814 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[69212855-3545-4a5b-a0fc-5d16b752773b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440789, 'tstamp': 440789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220088, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440791, 'tstamp': 440791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220088, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.816 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f942dd2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.818 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.819 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.821 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f942dd2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.821 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.821 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f942dd2-c0, col_values=(('external_ids', {'iface-id': '2fc42b0b-3731-4102-bca2-f1c3c740a7e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:17.822 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.866 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.867 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5567MB free_disk=73.20694732666016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.867 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.867 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.973 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120837.973021, 84c43bd6-304f-488c-8cc1-7d740b132d6d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.973 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] VM Started (Lifecycle Event)#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.985 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 12c7660a-27b8-417e-be1f-cccf937421a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.985 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 84c43bd6-304f-488c-8cc1-7d740b132d6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.985 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:27:17 np0005592767 nova_compute[182623]: 2026-01-22 22:27:17.986 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.011 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.014 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120837.9755564, 84c43bd6-304f-488c-8cc1-7d740b132d6d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.015 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.044 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.047 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.083 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.096 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.128 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.148 182627 DEBUG nova.compute.manager [req-c51e5d9e-2bd4-45bb-8f07-ad8c45c6eaf1 req-daf2fd21-95f3-473e-a983-a1f71a864ba5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.148 182627 DEBUG oslo_concurrency.lockutils [req-c51e5d9e-2bd4-45bb-8f07-ad8c45c6eaf1 req-daf2fd21-95f3-473e-a983-a1f71a864ba5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.149 182627 DEBUG oslo_concurrency.lockutils [req-c51e5d9e-2bd4-45bb-8f07-ad8c45c6eaf1 req-daf2fd21-95f3-473e-a983-a1f71a864ba5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.149 182627 DEBUG oslo_concurrency.lockutils [req-c51e5d9e-2bd4-45bb-8f07-ad8c45c6eaf1 req-daf2fd21-95f3-473e-a983-a1f71a864ba5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.149 182627 DEBUG nova.compute.manager [req-c51e5d9e-2bd4-45bb-8f07-ad8c45c6eaf1 req-daf2fd21-95f3-473e-a983-a1f71a864ba5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Processing event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.150 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.155 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120838.155442, 84c43bd6-304f-488c-8cc1-7d740b132d6d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.155 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.157 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.161 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.162 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.162 182627 INFO nova.virt.libvirt.driver [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance spawned successfully.#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.163 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.196 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.202 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.205 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.205 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.206 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.206 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.206 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.207 182627 DEBUG nova.virt.libvirt.driver [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.242 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.317 182627 INFO nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Took 5.60 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.317 182627 DEBUG nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.435 182627 INFO nova.compute.manager [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Took 6.14 seconds to build instance.#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.461 182627 DEBUG oslo_concurrency.lockutils [None req-6285633a-1f10-4004-a753-da5aaa3cdf54 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.502 182627 DEBUG nova.network.neutron [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updated VIF entry in instance network info cache for port ad68ff09-e8bf-45c3-acaa-6ea162f80993. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.503 182627 DEBUG nova.network.neutron [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updating instance_info_cache with network_info: [{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.543 182627 DEBUG oslo_concurrency.lockutils [req-f4247d68-850e-44df-8b39-65387e2d3084 req-0dda509a-c128-421c-abd3-ac52daa2dc4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:18 np0005592767 nova_compute[182623]: 2026-01-22 22:27:18.777 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.234 182627 DEBUG nova.compute.manager [req-fa5f055f-df68-49f2-ae46-50e7648cac96 req-452b1dd3-f7fb-427a-8262-0aef4c66c986 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.235 182627 DEBUG oslo_concurrency.lockutils [req-fa5f055f-df68-49f2-ae46-50e7648cac96 req-452b1dd3-f7fb-427a-8262-0aef4c66c986 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.235 182627 DEBUG oslo_concurrency.lockutils [req-fa5f055f-df68-49f2-ae46-50e7648cac96 req-452b1dd3-f7fb-427a-8262-0aef4c66c986 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.235 182627 DEBUG oslo_concurrency.lockutils [req-fa5f055f-df68-49f2-ae46-50e7648cac96 req-452b1dd3-f7fb-427a-8262-0aef4c66c986 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.236 182627 DEBUG nova.compute.manager [req-fa5f055f-df68-49f2-ae46-50e7648cac96 req-452b1dd3-f7fb-427a-8262-0aef4c66c986 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.236 182627 WARNING nova.compute.manager [req-fa5f055f-df68-49f2-ae46-50e7648cac96 req-452b1dd3-f7fb-427a-8262-0aef4c66c986 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:27:20 np0005592767 nova_compute[182623]: 2026-01-22 22:27:20.454 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:21 np0005592767 nova_compute[182623]: 2026-01-22 22:27:21.751 182627 DEBUG nova.compute.manager [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-changed-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:21 np0005592767 nova_compute[182623]: 2026-01-22 22:27:21.752 182627 DEBUG nova.compute.manager [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Refreshing instance network info cache due to event network-changed-ad68ff09-e8bf-45c3-acaa-6ea162f80993. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:27:21 np0005592767 nova_compute[182623]: 2026-01-22 22:27:21.753 182627 DEBUG oslo_concurrency.lockutils [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:21 np0005592767 nova_compute[182623]: 2026-01-22 22:27:21.753 182627 DEBUG oslo_concurrency.lockutils [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:21 np0005592767 nova_compute[182623]: 2026-01-22 22:27:21.753 182627 DEBUG nova.network.neutron [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Refreshing network info cache for port ad68ff09-e8bf-45c3-acaa-6ea162f80993 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:27:21 np0005592767 nova_compute[182623]: 2026-01-22 22:27:21.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:27:22 np0005592767 nova_compute[182623]: 2026-01-22 22:27:22.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:22 np0005592767 podman[220096]: 2026-01-22 22:27:22.144128571 +0000 UTC m=+0.063530605 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:27:22 np0005592767 nova_compute[182623]: 2026-01-22 22:27:22.545 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:22 np0005592767 nova_compute[182623]: 2026-01-22 22:27:22.545 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:22 np0005592767 nova_compute[182623]: 2026-01-22 22:27:22.545 182627 INFO nova.compute.manager [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Rebooting instance#033[00m
Jan 22 17:27:22 np0005592767 nova_compute[182623]: 2026-01-22 22:27:22.559 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:23 np0005592767 nova_compute[182623]: 2026-01-22 22:27:23.382 182627 DEBUG nova.network.neutron [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updated VIF entry in instance network info cache for port ad68ff09-e8bf-45c3-acaa-6ea162f80993. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:27:23 np0005592767 nova_compute[182623]: 2026-01-22 22:27:23.382 182627 DEBUG nova.network.neutron [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updating instance_info_cache with network_info: [{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:23 np0005592767 nova_compute[182623]: 2026-01-22 22:27:23.407 182627 DEBUG oslo_concurrency.lockutils [req-efd1fe6e-c9f6-4e8c-8e13-3eeb29e76281 req-6087604f-7d40-4992-8c44-b2465987ebce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:23 np0005592767 nova_compute[182623]: 2026-01-22 22:27:23.409 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquired lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:23 np0005592767 nova_compute[182623]: 2026-01-22 22:27:23.409 182627 DEBUG nova.network.neutron [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.268 182627 DEBUG nova.network.neutron [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updating instance_info_cache with network_info: [{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.284 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Releasing lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.296 182627 DEBUG nova.compute.manager [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 kernel: tapad68ff09-e8 (unregistering): left promiscuous mode
Jan 22 17:27:25 np0005592767 NetworkManager[54973]: <info>  [1769120845.5119] device (tapad68ff09-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:27:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:25Z|00222|binding|INFO|Releasing lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 from this chassis (sb_readonly=0)
Jan 22 17:27:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:25Z|00223|binding|INFO|Setting lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 down in Southbound
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.518 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:25Z|00224|binding|INFO|Removing iface tapad68ff09-e8 ovn-installed in OVS
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.521 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.531 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:7a:24 10.100.0.13'], port_security=['fa:16:3e:11:7a:24 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '84c43bd6-304f-488c-8cc1-7d740b132d6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f942dd2-c635-41b4-933f-433a748048f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059e811e196b4d02b1144af991a7abeb', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0cb8815a-a8d6-4a69-980c-69d9e19d6a4e 8f5cd7a4-2a72-4244-9eac-7246bfbad8f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db28277b-1f2f-44a5-8c51-d711c89d9d3d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ad68ff09-e8bf-45c3-acaa-6ea162f80993) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.532 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ad68ff09-e8bf-45c3-acaa-6ea162f80993 in datapath 8f942dd2-c635-41b4-933f-433a748048f1 unbound from our chassis#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.534 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f942dd2-c635-41b4-933f-433a748048f1#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.534 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.552 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ec55a6a9-f068-4e80-a612-eb5c772758ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:25 np0005592767 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000041.scope: Deactivated successfully.
Jan 22 17:27:25 np0005592767 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000041.scope: Consumed 7.530s CPU time.
Jan 22 17:27:25 np0005592767 systemd-machined[153912]: Machine qemu-31-instance-00000041 terminated.
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.577 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[77e1afdc-b257-48cc-afdf-f7f722b0e286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.579 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4770ebf4-af87-4640-8a52-09dd4c71071e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.598 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ff85e070-b04a-4661-9195-e97b4aed738c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.610 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae2f26b-110e-496e-a336-041a38a871e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f942dd2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:b4:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440778, 'reachable_time': 29701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220133, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.625 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a96aaea4-ed82-447f-8a9e-0583fa18fcc1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440789, 'tstamp': 440789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220134, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440791, 'tstamp': 440791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220134, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.626 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f942dd2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.627 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.630 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.631 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f942dd2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.631 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.632 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f942dd2-c0, col_values=(('external_ids', {'iface-id': '2fc42b0b-3731-4102-bca2-f1c3c740a7e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:25.632 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.718 182627 INFO nova.virt.libvirt.driver [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance destroyed successfully.#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.720 182627 DEBUG nova.objects.instance [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'resources' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.734 182627 DEBUG nova.virt.libvirt.vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:27:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-139043215',display_name='tempest-SecurityGroupsTestJSON-server-139043215',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-139043215',id=65,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-x0v0jtub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:27:25Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=84c43bd6-304f-488c-8cc1-7d740b132d6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.735 182627 DEBUG nova.network.os_vif_util [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.735 182627 DEBUG nova.network.os_vif_util [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.736 182627 DEBUG os_vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.738 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.738 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad68ff09-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.740 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.741 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.744 182627 INFO os_vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8')#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.751 182627 DEBUG nova.virt.libvirt.driver [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Start _get_guest_xml network_info=[{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.755 182627 WARNING nova.virt.libvirt.driver [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.761 182627 DEBUG nova.virt.libvirt.host [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.761 182627 DEBUG nova.virt.libvirt.host [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.765 182627 DEBUG nova.virt.libvirt.host [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.765 182627 DEBUG nova.virt.libvirt.host [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.766 182627 DEBUG nova.virt.libvirt.driver [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.767 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.767 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.767 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.767 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.768 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.768 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.768 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.768 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.769 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.769 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.769 182627 DEBUG nova.virt.hardware [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.769 182627 DEBUG nova.objects.instance [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.787 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.860 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.861 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.862 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.863 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.865 182627 DEBUG nova.virt.libvirt.vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:27:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-139043215',display_name='tempest-SecurityGroupsTestJSON-server-139043215',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-139043215',id=65,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-x0v0jtub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:27:25Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=84c43bd6-304f-488c-8cc1-7d740b132d6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.866 182627 DEBUG nova.network.os_vif_util [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.867 182627 DEBUG nova.network.os_vif_util [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.869 182627 DEBUG nova.objects.instance [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'pci_devices' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.890 182627 DEBUG nova.virt.libvirt.driver [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <uuid>84c43bd6-304f-488c-8cc1-7d740b132d6d</uuid>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <name>instance-00000041</name>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:name>tempest-SecurityGroupsTestJSON-server-139043215</nova:name>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:27:25</nova:creationTime>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:user uuid="e7ddb71f6cbf4fc3bfbaf99b01271ec0">tempest-SecurityGroupsTestJSON-999807366-project-member</nova:user>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:project uuid="059e811e196b4d02b1144af991a7abeb">tempest-SecurityGroupsTestJSON-999807366</nova:project>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        <nova:port uuid="ad68ff09-e8bf-45c3-acaa-6ea162f80993">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <entry name="serial">84c43bd6-304f-488c-8cc1-7d740b132d6d</entry>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <entry name="uuid">84c43bd6-304f-488c-8cc1-7d740b132d6d</entry>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk.config"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:11:7a:24"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <target dev="tapad68ff09-e8"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/console.log" append="off"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <input type="keyboard" bus="usb"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:27:25 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:27:25 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:27:25 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:27:25 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.892 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.947 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:25 np0005592767 nova_compute[182623]: 2026-01-22 22:27:25.949 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.038 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.041 182627 DEBUG nova.objects.instance [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.052 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.126 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.127 182627 DEBUG nova.virt.disk.api [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Checking if we can resize image /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.128 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.180 182627 DEBUG oslo_concurrency.processutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.181 182627 DEBUG nova.virt.disk.api [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Cannot resize image /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.181 182627 DEBUG nova.objects.instance [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'migration_context' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.196 182627 DEBUG nova.virt.libvirt.vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:27:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-139043215',display_name='tempest-SecurityGroupsTestJSON-server-139043215',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-139043215',id=65,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-x0v0jtub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:25Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=84c43bd6-304f-488c-8cc1-7d740b132d6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.197 182627 DEBUG nova.network.os_vif_util [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.197 182627 DEBUG nova.network.os_vif_util [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.198 182627 DEBUG os_vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.198 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.199 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.199 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.202 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.202 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad68ff09-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.202 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad68ff09-e8, col_values=(('external_ids', {'iface-id': 'ad68ff09-e8bf-45c3-acaa-6ea162f80993', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:7a:24', 'vm-uuid': '84c43bd6-304f-488c-8cc1-7d740b132d6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.204 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 NetworkManager[54973]: <info>  [1769120846.2050] manager: (tapad68ff09-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.206 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.208 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.209 182627 INFO os_vif [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8')#033[00m
Jan 22 17:27:26 np0005592767 kernel: tapad68ff09-e8: entered promiscuous mode
Jan 22 17:27:26 np0005592767 systemd-udevd[220127]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:27:26 np0005592767 NetworkManager[54973]: <info>  [1769120846.2854] manager: (tapad68ff09-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.285 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:26Z|00225|binding|INFO|Claiming lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 for this chassis.
Jan 22 17:27:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:26Z|00226|binding|INFO|ad68ff09-e8bf-45c3-acaa-6ea162f80993: Claiming fa:16:3e:11:7a:24 10.100.0.13
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.287 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 NetworkManager[54973]: <info>  [1769120846.2954] device (tapad68ff09-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:27:26 np0005592767 NetworkManager[54973]: <info>  [1769120846.2967] device (tapad68ff09-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.296 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:7a:24 10.100.0.13'], port_security=['fa:16:3e:11:7a:24 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '84c43bd6-304f-488c-8cc1-7d740b132d6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f942dd2-c635-41b4-933f-433a748048f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059e811e196b4d02b1144af991a7abeb', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0cb8815a-a8d6-4a69-980c-69d9e19d6a4e 8f5cd7a4-2a72-4244-9eac-7246bfbad8f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db28277b-1f2f-44a5-8c51-d711c89d9d3d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ad68ff09-e8bf-45c3-acaa-6ea162f80993) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:26Z|00227|binding|INFO|Setting lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 ovn-installed in OVS
Jan 22 17:27:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:26Z|00228|binding|INFO|Setting lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 up in Southbound
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.299 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ad68ff09-e8bf-45c3-acaa-6ea162f80993 in datapath 8f942dd2-c635-41b4-933f-433a748048f1 bound to our chassis#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.300 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.301 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.304 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f942dd2-c635-41b4-933f-433a748048f1#033[00m
Jan 22 17:27:26 np0005592767 systemd-machined[153912]: New machine qemu-32-instance-00000041.
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.323 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7fef18-0fd6-4c0e-95fa-eaa4784a74aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:26 np0005592767 systemd[1]: Started Virtual Machine qemu-32-instance-00000041.
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.351 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ac164f5e-8876-470c-b3b6-1b9bd5d1319e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.355 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[580c7dfe-59fc-46af-9565-b0f5c92a01df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.386 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bca339-48f9-493d-acda-6e115ec19738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.405 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d83894e0-7c82-4346-b1b0-49ea3ffe3411]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f942dd2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:b4:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440778, 'reachable_time': 29701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220195, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.421 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[59df3fb6-ee16-4fc4-9444-c45c7385e109]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440789, 'tstamp': 440789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220196, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440791, 'tstamp': 440791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220196, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.423 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f942dd2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.425 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.427 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f942dd2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.428 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.429 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f942dd2-c0, col_values=(('external_ids', {'iface-id': '2fc42b0b-3731-4102-bca2-f1c3c740a7e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:26.429 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.794 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 84c43bd6-304f-488c-8cc1-7d740b132d6d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.795 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120846.7935076, 84c43bd6-304f-488c-8cc1-7d740b132d6d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.796 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.800 182627 DEBUG nova.compute.manager [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.805 182627 INFO nova.virt.libvirt.driver [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance rebooted successfully.#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.806 182627 DEBUG nova.compute.manager [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.834 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.839 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.889 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120846.7939427, 84c43bd6-304f-488c-8cc1-7d740b132d6d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.890 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] VM Started (Lifecycle Event)#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.892 182627 DEBUG oslo_concurrency.lockutils [None req-8fc5d760-e939-4c29-9554-80d7b788d5b0 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.943 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.948 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.968 182627 DEBUG nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-unplugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.969 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.969 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.970 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.970 182627 DEBUG nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-unplugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.971 182627 WARNING nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-unplugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.971 182627 DEBUG nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.972 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.973 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.973 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.974 182627 DEBUG nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.974 182627 WARNING nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.975 182627 DEBUG nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.975 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.976 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.976 182627 DEBUG oslo_concurrency.lockutils [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.977 182627 DEBUG nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:26 np0005592767 nova_compute[182623]: 2026-01-22 22:27:26.977 182627 WARNING nova.compute.manager [req-2c7a3ee8-14a2-4cd3-9b30-33ce44d9db0e req-ce54b8d7-a610-4a4f-8325-aa30eba41962 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.000 182627 DEBUG nova.compute.manager [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-changed-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.000 182627 DEBUG nova.compute.manager [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Refreshing instance network info cache due to event network-changed-ad68ff09-e8bf-45c3-acaa-6ea162f80993. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.002 182627 DEBUG oslo_concurrency.lockutils [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.002 182627 DEBUG oslo_concurrency.lockutils [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.003 182627 DEBUG nova.network.neutron [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Refreshing network info cache for port ad68ff09-e8bf-45c3-acaa-6ea162f80993 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.044 182627 DEBUG nova.compute.manager [req-4754742f-958c-4a93-a9e2-c63346919a6d req-bb95bbfd-d63c-44a8-b3a7-6263809a0778 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.044 182627 DEBUG oslo_concurrency.lockutils [req-4754742f-958c-4a93-a9e2-c63346919a6d req-bb95bbfd-d63c-44a8-b3a7-6263809a0778 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.045 182627 DEBUG oslo_concurrency.lockutils [req-4754742f-958c-4a93-a9e2-c63346919a6d req-bb95bbfd-d63c-44a8-b3a7-6263809a0778 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.045 182627 DEBUG oslo_concurrency.lockutils [req-4754742f-958c-4a93-a9e2-c63346919a6d req-bb95bbfd-d63c-44a8-b3a7-6263809a0778 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.045 182627 DEBUG nova.compute.manager [req-4754742f-958c-4a93-a9e2-c63346919a6d req-bb95bbfd-d63c-44a8-b3a7-6263809a0778 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.046 182627 WARNING nova.compute.manager [req-4754742f-958c-4a93-a9e2-c63346919a6d req-bb95bbfd-d63c-44a8-b3a7-6263809a0778 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.556 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.557 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.557 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.557 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.558 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.569 182627 INFO nova.compute.manager [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Terminating instance#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.583 182627 DEBUG nova.compute.manager [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:27:29 np0005592767 kernel: tapad68ff09-e8 (unregistering): left promiscuous mode
Jan 22 17:27:29 np0005592767 NetworkManager[54973]: <info>  [1769120849.5982] device (tapad68ff09-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.604 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:29Z|00229|binding|INFO|Releasing lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 from this chassis (sb_readonly=0)
Jan 22 17:27:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:29Z|00230|binding|INFO|Setting lport ad68ff09-e8bf-45c3-acaa-6ea162f80993 down in Southbound
Jan 22 17:27:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:29Z|00231|binding|INFO|Removing iface tapad68ff09-e8 ovn-installed in OVS
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.608 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.614 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:7a:24 10.100.0.13'], port_security=['fa:16:3e:11:7a:24 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '84c43bd6-304f-488c-8cc1-7d740b132d6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f942dd2-c635-41b4-933f-433a748048f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059e811e196b4d02b1144af991a7abeb', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0cb8815a-a8d6-4a69-980c-69d9e19d6a4e 8f5cd7a4-2a72-4244-9eac-7246bfbad8f3 c96f226e-0e98-4e3c-b91b-f445d2ee4d0e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db28277b-1f2f-44a5-8c51-d711c89d9d3d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ad68ff09-e8bf-45c3-acaa-6ea162f80993) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.615 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ad68ff09-e8bf-45c3-acaa-6ea162f80993 in datapath 8f942dd2-c635-41b4-933f-433a748048f1 unbound from our chassis#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.617 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f942dd2-c635-41b4-933f-433a748048f1#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.623 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.631 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ebc598d3-b39e-4edb-9a34-0023af9ac4ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:29 np0005592767 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000041.scope: Deactivated successfully.
Jan 22 17:27:29 np0005592767 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000041.scope: Consumed 3.315s CPU time.
Jan 22 17:27:29 np0005592767 systemd-machined[153912]: Machine qemu-32-instance-00000041 terminated.
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.665 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3c00f1bd-8598-4bfe-889c-bdb3481a1131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.668 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ae71b022-77b3-4d8d-b333-ea9f8692e951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.697 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[331d3124-1987-4bb5-94e3-71c3eee68ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.715 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[618b1ace-3524-4127-a5a9-220b8a3ac924]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f942dd2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:b4:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440778, 'reachable_time': 29701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220216, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.729 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[de905a30-b87d-4153-beb9-62c5aca28a61]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440789, 'tstamp': 440789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220217, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8f942dd2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440791, 'tstamp': 440791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220217, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.731 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f942dd2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.736 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.737 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f942dd2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.737 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.737 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f942dd2-c0, col_values=(('external_ids', {'iface-id': '2fc42b0b-3731-4102-bca2-f1c3c740a7e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:29.737 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.805 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.810 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.853 182627 INFO nova.virt.libvirt.driver [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Instance destroyed successfully.#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.854 182627 DEBUG nova.objects.instance [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'resources' on Instance uuid 84c43bd6-304f-488c-8cc1-7d740b132d6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.871 182627 DEBUG nova.virt.libvirt.vif [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:27:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-139043215',display_name='tempest-SecurityGroupsTestJSON-server-139043215',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-139043215',id=65,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-x0v0jtub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:27:26Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=84c43bd6-304f-488c-8cc1-7d740b132d6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.871 182627 DEBUG nova.network.os_vif_util [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.873 182627 DEBUG nova.network.os_vif_util [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.873 182627 DEBUG os_vif [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.877 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.877 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad68ff09-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.879 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.881 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.884 182627 INFO os_vif [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7a:24,bridge_name='br-int',has_traffic_filtering=True,id=ad68ff09-e8bf-45c3-acaa-6ea162f80993,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad68ff09-e8')#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.885 182627 INFO nova.virt.libvirt.driver [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Deleting instance files /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d_del#033[00m
Jan 22 17:27:29 np0005592767 nova_compute[182623]: 2026-01-22 22:27:29.886 182627 INFO nova.virt.libvirt.driver [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Deletion of /var/lib/nova/instances/84c43bd6-304f-488c-8cc1-7d740b132d6d_del complete#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.010 182627 INFO nova.compute.manager [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.011 182627 DEBUG oslo.service.loopingcall [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.011 182627 DEBUG nova.compute.manager [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.012 182627 DEBUG nova.network.neutron [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.432 182627 DEBUG nova.network.neutron [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updated VIF entry in instance network info cache for port ad68ff09-e8bf-45c3-acaa-6ea162f80993. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.432 182627 DEBUG nova.network.neutron [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updating instance_info_cache with network_info: [{"id": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "address": "fa:16:3e:11:7a:24", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad68ff09-e8", "ovs_interfaceid": "ad68ff09-e8bf-45c3-acaa-6ea162f80993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.446 182627 DEBUG oslo_concurrency.lockutils [req-2109050f-08d2-4348-af21-ee89fdc5f55c req-a4c15fad-a752-47ee-a0ac-3dee3e0548ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-84c43bd6-304f-488c-8cc1-7d740b132d6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.506 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.583 182627 DEBUG nova.network.neutron [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.602 182627 INFO nova.compute.manager [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Took 0.59 seconds to deallocate network for instance.#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.738 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.739 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.869 182627 DEBUG nova.compute.provider_tree [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.889 182627 DEBUG nova.scheduler.client.report [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.911 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:30 np0005592767 nova_compute[182623]: 2026-01-22 22:27:30.954 182627 INFO nova.scheduler.client.report [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Deleted allocations for instance 84c43bd6-304f-488c-8cc1-7d740b132d6d#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.024 182627 DEBUG oslo_concurrency.lockutils [None req-845f769c-03a7-4f3b-86ba-694427a12890 e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.077 182627 DEBUG nova.compute.manager [req-3afd1d5f-e9a3-4154-b9f9-8385ed8861b7 req-dc5e39d4-909f-402c-b4a1-d56caad9a3fd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-deleted-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.135 182627 DEBUG nova.compute.manager [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-unplugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.136 182627 DEBUG oslo_concurrency.lockutils [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.136 182627 DEBUG oslo_concurrency.lockutils [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.136 182627 DEBUG oslo_concurrency.lockutils [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.137 182627 DEBUG nova.compute.manager [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-unplugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.137 182627 WARNING nova.compute.manager [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-unplugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.137 182627 DEBUG nova.compute.manager [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.137 182627 DEBUG oslo_concurrency.lockutils [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.138 182627 DEBUG oslo_concurrency.lockutils [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.138 182627 DEBUG oslo_concurrency.lockutils [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "84c43bd6-304f-488c-8cc1-7d740b132d6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.138 182627 DEBUG nova.compute.manager [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] No waiting events found dispatching network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:31 np0005592767 nova_compute[182623]: 2026-01-22 22:27:31.138 182627 WARNING nova.compute.manager [req-602fe14e-f964-4432-8881-880dfc991f4e req-ef2ff91b-2281-43d0-bb30-da5c77e0cab3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Received unexpected event network-vif-plugged-ad68ff09-e8bf-45c3-acaa-6ea162f80993 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:27:34 np0005592767 nova_compute[182623]: 2026-01-22 22:27:34.880 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:35 np0005592767 podman[220235]: 2026-01-22 22:27:35.194069681 +0000 UTC m=+0.103764971 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 17:27:35 np0005592767 nova_compute[182623]: 2026-01-22 22:27:35.508 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.905 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.906 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.907 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.907 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.908 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.929 182627 INFO nova.compute.manager [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Terminating instance#033[00m
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.944 182627 DEBUG nova.compute.manager [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:27:37 np0005592767 kernel: tap5524effc-bc (unregistering): left promiscuous mode
Jan 22 17:27:37 np0005592767 NetworkManager[54973]: <info>  [1769120857.9775] device (tap5524effc-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:27:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:37Z|00232|binding|INFO|Releasing lport 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae from this chassis (sb_readonly=0)
Jan 22 17:27:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:37Z|00233|binding|INFO|Setting lport 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae down in Southbound
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.983 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:37Z|00234|binding|INFO|Removing iface tap5524effc-bc ovn-installed in OVS
Jan 22 17:27:37 np0005592767 nova_compute[182623]: 2026-01-22 22:27:37.985 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:37.989 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:ac:f4 10.100.0.12'], port_security=['fa:16:3e:25:ac:f4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '12c7660a-27b8-417e-be1f-cccf937421a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f942dd2-c635-41b4-933f-433a748048f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059e811e196b4d02b1144af991a7abeb', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0cb8815a-a8d6-4a69-980c-69d9e19d6a4e 34b332a5-a5b2-479d-a252-5d6dc557c380 915cf172-1e46-4365-94ed-fcc2a9a6d131', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db28277b-1f2f-44a5-8c51-d711c89d9d3d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:37.990 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 5524effc-bcff-46ea-90a6-d4ec2eb3b8ae in datapath 8f942dd2-c635-41b4-933f-433a748048f1 unbound from our chassis#033[00m
Jan 22 17:27:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:37.991 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f942dd2-c635-41b4-933f-433a748048f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:27:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:37.992 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a58469-5581-4dba-937b-9ce797357e83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:37.993 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1 namespace which is not needed anymore#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.005 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:38 np0005592767 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 22 17:27:38 np0005592767 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003e.scope: Consumed 13.302s CPU time.
Jan 22 17:27:38 np0005592767 systemd-machined[153912]: Machine qemu-30-instance-0000003e terminated.
Jan 22 17:27:38 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [NOTICE]   (219889) : haproxy version is 2.8.14-c23fe91
Jan 22 17:27:38 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [NOTICE]   (219889) : path to executable is /usr/sbin/haproxy
Jan 22 17:27:38 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [WARNING]  (219889) : Exiting Master process...
Jan 22 17:27:38 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [WARNING]  (219889) : Exiting Master process...
Jan 22 17:27:38 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [ALERT]    (219889) : Current worker (219891) exited with code 143 (Terminated)
Jan 22 17:27:38 np0005592767 neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1[219885]: [WARNING]  (219889) : All workers exited. Exiting... (0)
Jan 22 17:27:38 np0005592767 systemd[1]: libpod-c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c.scope: Deactivated successfully.
Jan 22 17:27:38 np0005592767 podman[220281]: 2026-01-22 22:27:38.134822676 +0000 UTC m=+0.046442887 container died c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:27:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c-userdata-shm.mount: Deactivated successfully.
Jan 22 17:27:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay-653e9fe981ef345cdac7d18ec270f254ffe69aaf91e09920ea2b536a2ec2ca28-merged.mount: Deactivated successfully.
Jan 22 17:27:38 np0005592767 podman[220281]: 2026-01-22 22:27:38.173061619 +0000 UTC m=+0.084681840 container cleanup c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:27:38 np0005592767 systemd[1]: libpod-conmon-c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c.scope: Deactivated successfully.
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.205 182627 DEBUG nova.compute.manager [req-96f70392-618e-47c5-96b9-9874f0e8e733 req-4f6a6974-0e53-4e77-b558-df957ba41c58 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-vif-unplugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.206 182627 DEBUG oslo_concurrency.lockutils [req-96f70392-618e-47c5-96b9-9874f0e8e733 req-4f6a6974-0e53-4e77-b558-df957ba41c58 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.206 182627 DEBUG oslo_concurrency.lockutils [req-96f70392-618e-47c5-96b9-9874f0e8e733 req-4f6a6974-0e53-4e77-b558-df957ba41c58 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.206 182627 DEBUG oslo_concurrency.lockutils [req-96f70392-618e-47c5-96b9-9874f0e8e733 req-4f6a6974-0e53-4e77-b558-df957ba41c58 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.207 182627 DEBUG nova.compute.manager [req-96f70392-618e-47c5-96b9-9874f0e8e733 req-4f6a6974-0e53-4e77-b558-df957ba41c58 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] No waiting events found dispatching network-vif-unplugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.207 182627 DEBUG nova.compute.manager [req-96f70392-618e-47c5-96b9-9874f0e8e733 req-4f6a6974-0e53-4e77-b558-df957ba41c58 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-vif-unplugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.210 182627 INFO nova.virt.libvirt.driver [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Instance destroyed successfully.#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.211 182627 DEBUG nova.objects.instance [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lazy-loading 'resources' on Instance uuid 12c7660a-27b8-417e-be1f-cccf937421a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.230 182627 DEBUG nova.virt.libvirt.vif [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2019090657',display_name='tempest-SecurityGroupsTestJSON-server-2019090657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2019090657',id=62,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:26:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='059e811e196b4d02b1144af991a7abeb',ramdisk_id='',reservation_id='r-4fd99k3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-999807366',owner_user_name='tempest-SecurityGroupsTestJSON-999807366-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:26:52Z,user_data=None,user_id='e7ddb71f6cbf4fc3bfbaf99b01271ec0',uuid=12c7660a-27b8-417e-be1f-cccf937421a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.231 182627 DEBUG nova.network.os_vif_util [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converting VIF {"id": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "address": "fa:16:3e:25:ac:f4", "network": {"id": "8f942dd2-c635-41b4-933f-433a748048f1", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1952208124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "059e811e196b4d02b1144af991a7abeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5524effc-bc", "ovs_interfaceid": "5524effc-bcff-46ea-90a6-d4ec2eb3b8ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.232 182627 DEBUG nova.network.os_vif_util [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.232 182627 DEBUG os_vif [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.233 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.234 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5524effc-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:38 np0005592767 podman[220320]: 2026-01-22 22:27:38.235431137 +0000 UTC m=+0.041662362 container remove c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.236 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.238 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.240 182627 INFO os_vif [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:ac:f4,bridge_name='br-int',has_traffic_filtering=True,id=5524effc-bcff-46ea-90a6-d4ec2eb3b8ae,network=Network(8f942dd2-c635-41b4-933f-433a748048f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5524effc-bc')#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.240 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c1f8eb-2572-48d6-9cbb-68048618db35]: (4, ('Thu Jan 22 10:27:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1 (c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c)\nc0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c\nThu Jan 22 10:27:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1 (c0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c)\nc0f116e931dc0035ddcb43aaf63f0946d5ba060523b7b42cf756647b5b80f43c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.241 182627 INFO nova.virt.libvirt.driver [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Deleting instance files /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2_del#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.241 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff59ba9-7171-4702-8347-d9514a74be2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.242 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f942dd2-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.242 182627 INFO nova.virt.libvirt.driver [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Deletion of /var/lib/nova/instances/12c7660a-27b8-417e-be1f-cccf937421a2_del complete#033[00m
Jan 22 17:27:38 np0005592767 kernel: tap8f942dd2-c0: left promiscuous mode
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.247 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.255 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.257 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[36c8b635-1b1d-4030-ad96-cc22daaa4025]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.275 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[02d6f05c-2e9f-48c6-bc5d-5237a6ec3444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.276 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e82141e3-22e1-4a5b-afab-2f9eddf16748]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.289 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e66efd58-9a40-4de1-8585-8fc8995cc0f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440771, 'reachable_time': 40087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220342, 'error': None, 'target': 'ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 systemd[1]: run-netns-ovnmeta\x2d8f942dd2\x2dc635\x2d41b4\x2d933f\x2d433a748048f1.mount: Deactivated successfully.
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.294 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f942dd2-c635-41b4-933f-433a748048f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:27:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:38.295 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[96a0902c-75ef-4a1a-96cf-d67ab6a48ada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.339 182627 INFO nova.compute.manager [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.340 182627 DEBUG oslo.service.loopingcall [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.340 182627 DEBUG nova.compute.manager [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:27:38 np0005592767 nova_compute[182623]: 2026-01-22 22:27:38.340 182627 DEBUG nova.network.neutron [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.029 182627 DEBUG nova.network.neutron [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.044 182627 INFO nova.compute.manager [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Took 0.70 seconds to deallocate network for instance.#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.128 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.128 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.177 182627 DEBUG nova.compute.provider_tree [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.198 182627 DEBUG nova.scheduler.client.report [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.223 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.248 182627 INFO nova.scheduler.client.report [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Deleted allocations for instance 12c7660a-27b8-417e-be1f-cccf937421a2#033[00m
Jan 22 17:27:39 np0005592767 nova_compute[182623]: 2026-01-22 22:27:39.391 182627 DEBUG oslo_concurrency.lockutils [None req-a9bdaf66-c170-4ac1-965f-45c3e1def40c e7ddb71f6cbf4fc3bfbaf99b01271ec0 059e811e196b4d02b1144af991a7abeb - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:40 np0005592767 podman[220343]: 2026-01-22 22:27:40.164492785 +0000 UTC m=+0.080212424 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:27:40 np0005592767 podman[220344]: 2026-01-22 22:27:40.178276625 +0000 UTC m=+0.094866709 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.274 182627 DEBUG nova.compute.manager [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.274 182627 DEBUG oslo_concurrency.lockutils [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.275 182627 DEBUG oslo_concurrency.lockutils [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.275 182627 DEBUG oslo_concurrency.lockutils [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "12c7660a-27b8-417e-be1f-cccf937421a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.275 182627 DEBUG nova.compute.manager [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] No waiting events found dispatching network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.275 182627 WARNING nova.compute.manager [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received unexpected event network-vif-plugged-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.275 182627 DEBUG nova.compute.manager [req-b00994a2-83a7-439e-8b60-4bd74a4323d6 req-f011b3c6-2b94-4143-9961-dcac454103cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Received event network-vif-deleted-5524effc-bcff-46ea-90a6-d4ec2eb3b8ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:40 np0005592767 nova_compute[182623]: 2026-01-22 22:27:40.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:43 np0005592767 nova_compute[182623]: 2026-01-22 22:27:43.273 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:43 np0005592767 nova_compute[182623]: 2026-01-22 22:27:43.278 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:44 np0005592767 nova_compute[182623]: 2026-01-22 22:27:44.852 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120849.8502476, 84c43bd6-304f-488c-8cc1-7d740b132d6d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:44 np0005592767 nova_compute[182623]: 2026-01-22 22:27:44.853 182627 INFO nova.compute.manager [-] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:27:44 np0005592767 nova_compute[182623]: 2026-01-22 22:27:44.872 182627 DEBUG nova.compute.manager [None req-cefabc88-30ca-4ec4-8bcc-caf2bab62926 - - - - - -] [instance: 84c43bd6-304f-488c-8cc1-7d740b132d6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:45 np0005592767 nova_compute[182623]: 2026-01-22 22:27:45.513 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:46 np0005592767 podman[220388]: 2026-01-22 22:27:46.152359426 +0000 UTC m=+0.056074609 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:27:46 np0005592767 podman[220387]: 2026-01-22 22:27:46.186828973 +0000 UTC m=+0.096055703 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.760 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.760 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.774 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.886 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.886 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.894 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:27:46 np0005592767 nova_compute[182623]: 2026-01-22 22:27:46.895 182627 INFO nova.compute.claims [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.049 182627 DEBUG nova.compute.provider_tree [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.076 182627 DEBUG nova.scheduler.client.report [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.101 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.102 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.167 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.168 182627 DEBUG nova.network.neutron [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.191 182627 INFO nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.210 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.337 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.339 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.339 182627 INFO nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Creating image(s)#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.340 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "/var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.340 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "/var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.341 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "/var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.358 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:47.373 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.374 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:47.374 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.410 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.410 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.411 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.424 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.474 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.475 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.494 182627 DEBUG nova.policy [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fba86c7064d54b7e8f99276901e384ad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fddbe2a6da13452699c5bbc558443813', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.505 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.506 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.507 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.560 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.561 182627 DEBUG nova.virt.disk.api [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Checking if we can resize image /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.562 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.616 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.617 182627 DEBUG nova.virt.disk.api [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Cannot resize image /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.618 182627 DEBUG nova.objects.instance [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lazy-loading 'migration_context' on Instance uuid 171a813c-4b1b-4775-9118-bb981fe6e552 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.631 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.631 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Ensure instance console log exists: /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.632 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.632 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:47 np0005592767 nova_compute[182623]: 2026-01-22 22:27:47.632 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:48 np0005592767 nova_compute[182623]: 2026-01-22 22:27:48.276 182627 DEBUG nova.network.neutron [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Successfully created port: 7902cbe2-977d-40ee-9fe2-3fada95d919d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:27:48 np0005592767 nova_compute[182623]: 2026-01-22 22:27:48.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.087 182627 DEBUG nova.network.neutron [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Successfully updated port: 7902cbe2-977d-40ee-9fe2-3fada95d919d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.105 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "refresh_cache-171a813c-4b1b-4775-9118-bb981fe6e552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.105 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquired lock "refresh_cache-171a813c-4b1b-4775-9118-bb981fe6e552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.106 182627 DEBUG nova.network.neutron [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.192 182627 DEBUG nova.compute.manager [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-changed-7902cbe2-977d-40ee-9fe2-3fada95d919d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.193 182627 DEBUG nova.compute.manager [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Refreshing instance network info cache due to event network-changed-7902cbe2-977d-40ee-9fe2-3fada95d919d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.193 182627 DEBUG oslo_concurrency.lockutils [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-171a813c-4b1b-4775-9118-bb981fe6e552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:27:49 np0005592767 nova_compute[182623]: 2026-01-22 22:27:49.263 182627 DEBUG nova.network.neutron [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.006 182627 DEBUG nova.network.neutron [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Updating instance_info_cache with network_info: [{"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.045 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Releasing lock "refresh_cache-171a813c-4b1b-4775-9118-bb981fe6e552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.046 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Instance network_info: |[{"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.047 182627 DEBUG oslo_concurrency.lockutils [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-171a813c-4b1b-4775-9118-bb981fe6e552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.047 182627 DEBUG nova.network.neutron [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Refreshing network info cache for port 7902cbe2-977d-40ee-9fe2-3fada95d919d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.056 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Start _get_guest_xml network_info=[{"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.062 182627 WARNING nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.067 182627 DEBUG nova.virt.libvirt.host [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.068 182627 DEBUG nova.virt.libvirt.host [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.071 182627 DEBUG nova.virt.libvirt.host [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.071 182627 DEBUG nova.virt.libvirt.host [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.072 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.073 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.073 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.073 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.074 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.074 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.074 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.074 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.075 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.075 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.075 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.075 182627 DEBUG nova.virt.hardware [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.078 182627 DEBUG nova.virt.libvirt.vif [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:27:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-175198653',display_name='tempest-ImagesOneServerNegativeTestJSON-server-175198653',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-175198653',id=68,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fddbe2a6da13452699c5bbc558443813',ramdisk_id='',reservation_id='r-t0dbvjd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1015176105',owner_u
ser_name='tempest-ImagesOneServerNegativeTestJSON-1015176105-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:47Z,user_data=None,user_id='fba86c7064d54b7e8f99276901e384ad',uuid=171a813c-4b1b-4775-9118-bb981fe6e552,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.079 182627 DEBUG nova.network.os_vif_util [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Converting VIF {"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.079 182627 DEBUG nova.network.os_vif_util [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.080 182627 DEBUG nova.objects.instance [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lazy-loading 'pci_devices' on Instance uuid 171a813c-4b1b-4775-9118-bb981fe6e552 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.092 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <uuid>171a813c-4b1b-4775-9118-bb981fe6e552</uuid>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <name>instance-00000044</name>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-175198653</nova:name>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:27:50</nova:creationTime>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:user uuid="fba86c7064d54b7e8f99276901e384ad">tempest-ImagesOneServerNegativeTestJSON-1015176105-project-member</nova:user>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:project uuid="fddbe2a6da13452699c5bbc558443813">tempest-ImagesOneServerNegativeTestJSON-1015176105</nova:project>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        <nova:port uuid="7902cbe2-977d-40ee-9fe2-3fada95d919d">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <entry name="serial">171a813c-4b1b-4775-9118-bb981fe6e552</entry>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <entry name="uuid">171a813c-4b1b-4775-9118-bb981fe6e552</entry>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.config"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:09:f4:f5"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <target dev="tap7902cbe2-97"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/console.log" append="off"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:27:50 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:27:50 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:27:50 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:27:50 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.093 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Preparing to wait for external event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.093 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.094 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.094 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.094 182627 DEBUG nova.virt.libvirt.vif [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:27:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-175198653',display_name='tempest-ImagesOneServerNegativeTestJSON-server-175198653',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-175198653',id=68,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fddbe2a6da13452699c5bbc558443813',ramdisk_id='',reservation_id='r-t0dbvjd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1015176105',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1015176105-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:27:47Z,user_data=None,user_id='fba86c7064d54b7e8f99276901e384ad',uuid=171a813c-4b1b-4775-9118-bb981fe6e552,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.095 182627 DEBUG nova.network.os_vif_util [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Converting VIF {"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.095 182627 DEBUG nova.network.os_vif_util [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.096 182627 DEBUG os_vif [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.096 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.096 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.097 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.099 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.099 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7902cbe2-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.100 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7902cbe2-97, col_values=(('external_ids', {'iface-id': '7902cbe2-977d-40ee-9fe2-3fada95d919d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:f4:f5', 'vm-uuid': '171a813c-4b1b-4775-9118-bb981fe6e552'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.101 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 NetworkManager[54973]: <info>  [1769120870.1022] manager: (tap7902cbe2-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.104 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.107 182627 INFO os_vif [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97')#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.147 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.147 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.147 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] No VIF found with MAC fa:16:3e:09:f4:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.148 182627 INFO nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Using config drive#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.553 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.594 182627 INFO nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Creating config drive at /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.config#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.599 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqgicnawe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.722 182627 DEBUG oslo_concurrency.processutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqgicnawe" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:27:50 np0005592767 kernel: tap7902cbe2-97: entered promiscuous mode
Jan 22 17:27:50 np0005592767 NetworkManager[54973]: <info>  [1769120870.7880] manager: (tap7902cbe2-97): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Jan 22 17:27:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:50Z|00235|binding|INFO|Claiming lport 7902cbe2-977d-40ee-9fe2-3fada95d919d for this chassis.
Jan 22 17:27:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:50Z|00236|binding|INFO|7902cbe2-977d-40ee-9fe2-3fada95d919d: Claiming fa:16:3e:09:f4:f5 10.100.0.3
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.789 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.792 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.806 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:f4:f5 10.100.0.3'], port_security=['fa:16:3e:09:f4:f5 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '171a813c-4b1b-4775-9118-bb981fe6e552', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fddbe2a6da13452699c5bbc558443813', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7eabe7ea-2a17-4ccb-a2a9-b99b0529ac3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e3ff32d-9adc-48ac-b751-9c6b2b302a81, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=7902cbe2-977d-40ee-9fe2-3fada95d919d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.808 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 7902cbe2-977d-40ee-9fe2-3fada95d919d in datapath fe4b5322-4c9b-44ee-837c-ac7fe98d7096 bound to our chassis#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.810 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe4b5322-4c9b-44ee-837c-ac7fe98d7096#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.822 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8d0aa5-fa1e-4d4c-b20b-e0ab72395882]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.824 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe4b5322-41 in ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.825 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe4b5322-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.825 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f5123313-753e-4e3b-a180-e35caf9c7ca5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.827 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c28c7e-acd2-4c70-b037-28f65579b086]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 systemd-udevd[220460]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.838 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[95e83291-0ba5-4d91-82b1-8964f66f4810]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 NetworkManager[54973]: <info>  [1769120870.8442] device (tap7902cbe2-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.844 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 NetworkManager[54973]: <info>  [1769120870.8466] device (tap7902cbe2-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:27:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:50Z|00237|binding|INFO|Setting lport 7902cbe2-977d-40ee-9fe2-3fada95d919d ovn-installed in OVS
Jan 22 17:27:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:50Z|00238|binding|INFO|Setting lport 7902cbe2-977d-40ee-9fe2-3fada95d919d up in Southbound
Jan 22 17:27:50 np0005592767 systemd-machined[153912]: New machine qemu-33-instance-00000044.
Jan 22 17:27:50 np0005592767 nova_compute[182623]: 2026-01-22 22:27:50.850 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.857 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[22e2a7b7-23ca-4ffd-877b-6d74430ceac9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 systemd[1]: Started Virtual Machine qemu-33-instance-00000044.
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.884 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[883af1e8-4f32-4d29-bf4e-273c0e8cec97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 NetworkManager[54973]: <info>  [1769120870.8904] manager: (tapfe4b5322-40): new Veth device (/org/freedesktop/NetworkManager/Devices/114)
Jan 22 17:27:50 np0005592767 systemd-udevd[220464]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.890 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[902db934-b50b-4020-8b75-fcba9e604d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.919 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8db21604-a4ad-4c4d-bee1-84f8b8efe106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.922 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e566a36d-a614-43bd-baef-cc3db6bad0e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 NetworkManager[54973]: <info>  [1769120870.9428] device (tapfe4b5322-40): carrier: link connected
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.948 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b29eab93-9691-4816-ab43-86215d4e2e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.964 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[17d64489-bb70-4568-acc1-93316e834f55]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe4b5322-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:f9:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446752, 'reachable_time': 16840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220493, 'error': None, 'target': 'ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.978 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9175d1ac-92ea-4d60-b0a0-00d8942c7d71]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe94:f9d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446752, 'tstamp': 446752}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220494, 'error': None, 'target': 'ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:50.993 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[59d7915c-e087-48b7-be6a-e12195ae7773]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe4b5322-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:f9:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446752, 'reachable_time': 16840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220495, 'error': None, 'target': 'ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.019 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0de16c31-02f5-4eed-aa24-32ff8a9626ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.071 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6b38bcf9-eba1-4f12-add3-d4b92df3b463]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe4b5322-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe4b5322-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.077 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:51 np0005592767 NetworkManager[54973]: <info>  [1769120871.0785] manager: (tapfe4b5322-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Jan 22 17:27:51 np0005592767 kernel: tapfe4b5322-40: entered promiscuous mode
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.080 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.081 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe4b5322-40, col_values=(('external_ids', {'iface-id': 'ceebc8b0-317b-4980-a5c9-0e346653b164'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:51Z|00239|binding|INFO|Releasing lport ceebc8b0-317b-4980-a5c9-0e346653b164 from this chassis (sb_readonly=0)
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.092 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.092 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe4b5322-4c9b-44ee-837c-ac7fe98d7096.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe4b5322-4c9b-44ee-837c-ac7fe98d7096.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.093 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[73d0cdeb-48f3-4415-8e7b-42bdeb68e9c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.094 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-fe4b5322-4c9b-44ee-837c-ac7fe98d7096
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/fe4b5322-4c9b-44ee-837c-ac7fe98d7096.pid.haproxy
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID fe4b5322-4c9b-44ee-837c-ac7fe98d7096
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:27:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:51.094 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'env', 'PROCESS_TAG=haproxy-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe4b5322-4c9b-44ee-837c-ac7fe98d7096.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.125 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120871.1247482, 171a813c-4b1b-4775-9118-bb981fe6e552 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.126 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] VM Started (Lifecycle Event)#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.145 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.150 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120871.1249592, 171a813c-4b1b-4775-9118-bb981fe6e552 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.150 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.167 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.172 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.190 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.407 182627 DEBUG nova.compute.manager [req-71c88ae5-f26f-45bc-878a-40e6f1399d94 req-34e61aa2-2577-4e52-90fc-8c31d781522c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.408 182627 DEBUG oslo_concurrency.lockutils [req-71c88ae5-f26f-45bc-878a-40e6f1399d94 req-34e61aa2-2577-4e52-90fc-8c31d781522c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.408 182627 DEBUG oslo_concurrency.lockutils [req-71c88ae5-f26f-45bc-878a-40e6f1399d94 req-34e61aa2-2577-4e52-90fc-8c31d781522c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.408 182627 DEBUG oslo_concurrency.lockutils [req-71c88ae5-f26f-45bc-878a-40e6f1399d94 req-34e61aa2-2577-4e52-90fc-8c31d781522c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.408 182627 DEBUG nova.compute.manager [req-71c88ae5-f26f-45bc-878a-40e6f1399d94 req-34e61aa2-2577-4e52-90fc-8c31d781522c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Processing event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.409 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.415 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120871.415666, 171a813c-4b1b-4775-9118-bb981fe6e552 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.416 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.420 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.424 182627 INFO nova.virt.libvirt.driver [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Instance spawned successfully.#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.425 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.433 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.437 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.454 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.454 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.455 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.456 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.456 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.457 182627 DEBUG nova.virt.libvirt.driver [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.461 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:27:51 np0005592767 podman[220534]: 2026-01-22 22:27:51.463131922 +0000 UTC m=+0.056289086 container create e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:27:51 np0005592767 systemd[1]: Started libpod-conmon-e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16.scope.
Jan 22 17:27:51 np0005592767 podman[220534]: 2026-01-22 22:27:51.434977114 +0000 UTC m=+0.028134298 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:27:51 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.536 182627 INFO nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Took 4.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.537 182627 DEBUG nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:51 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83c8dc32b724e76b5efbedd12a641d217a93908d60fadfa574b2086753e19313/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:27:51 np0005592767 podman[220534]: 2026-01-22 22:27:51.557076484 +0000 UTC m=+0.150233668 container init e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:27:51 np0005592767 podman[220534]: 2026-01-22 22:27:51.562429085 +0000 UTC m=+0.155586249 container start e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:27:51 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [NOTICE]   (220553) : New worker (220555) forked
Jan 22 17:27:51 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [NOTICE]   (220553) : Loading success.
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.638 182627 INFO nova.compute.manager [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Took 4.81 seconds to build instance.#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.662 182627 DEBUG oslo_concurrency.lockutils [None req-ffc1fc17-ca57-4104-9d4b-74d919b029c0 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.876 182627 DEBUG nova.network.neutron [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Updated VIF entry in instance network info cache for port 7902cbe2-977d-40ee-9fe2-3fada95d919d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.876 182627 DEBUG nova.network.neutron [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Updating instance_info_cache with network_info: [{"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:51 np0005592767 nova_compute[182623]: 2026-01-22 22:27:51.903 182627 DEBUG oslo_concurrency.lockutils [req-3472be84-450b-4e20-baae-9f2948b3fce1 req-d369fbc4-171b-46ac-8194-7689396bdb7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-171a813c-4b1b-4775-9118-bb981fe6e552" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:27:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:52.376 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.823 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.824 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.824 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.825 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.825 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.836 182627 INFO nova.compute.manager [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Terminating instance#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.850 182627 DEBUG nova.compute.manager [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:27:52 np0005592767 kernel: tap7902cbe2-97 (unregistering): left promiscuous mode
Jan 22 17:27:52 np0005592767 NetworkManager[54973]: <info>  [1769120872.8764] device (tap7902cbe2-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.884 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:52Z|00240|binding|INFO|Releasing lport 7902cbe2-977d-40ee-9fe2-3fada95d919d from this chassis (sb_readonly=0)
Jan 22 17:27:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:52Z|00241|binding|INFO|Setting lport 7902cbe2-977d-40ee-9fe2-3fada95d919d down in Southbound
Jan 22 17:27:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:27:52Z|00242|binding|INFO|Removing iface tap7902cbe2-97 ovn-installed in OVS
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.886 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:52.893 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:f4:f5 10.100.0.3'], port_security=['fa:16:3e:09:f4:f5 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '171a813c-4b1b-4775-9118-bb981fe6e552', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fddbe2a6da13452699c5bbc558443813', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7eabe7ea-2a17-4ccb-a2a9-b99b0529ac3f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e3ff32d-9adc-48ac-b751-9c6b2b302a81, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=7902cbe2-977d-40ee-9fe2-3fada95d919d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:27:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:52.895 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 7902cbe2-977d-40ee-9fe2-3fada95d919d in datapath fe4b5322-4c9b-44ee-837c-ac7fe98d7096 unbound from our chassis#033[00m
Jan 22 17:27:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:52.897 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe4b5322-4c9b-44ee-837c-ac7fe98d7096, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:27:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:52.898 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[01fb425e-2f86-477f-8c10-9b994ba77171]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:52.898 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096 namespace which is not needed anymore#033[00m
Jan 22 17:27:52 np0005592767 nova_compute[182623]: 2026-01-22 22:27:52.901 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:52 np0005592767 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000044.scope: Deactivated successfully.
Jan 22 17:27:52 np0005592767 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000044.scope: Consumed 1.606s CPU time.
Jan 22 17:27:52 np0005592767 systemd-machined[153912]: Machine qemu-33-instance-00000044 terminated.
Jan 22 17:27:52 np0005592767 podman[220564]: 2026-01-22 22:27:52.9555973 +0000 UTC m=+0.058999263 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:27:53 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [NOTICE]   (220553) : haproxy version is 2.8.14-c23fe91
Jan 22 17:27:53 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [NOTICE]   (220553) : path to executable is /usr/sbin/haproxy
Jan 22 17:27:53 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [WARNING]  (220553) : Exiting Master process...
Jan 22 17:27:53 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [WARNING]  (220553) : Exiting Master process...
Jan 22 17:27:53 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [ALERT]    (220553) : Current worker (220555) exited with code 143 (Terminated)
Jan 22 17:27:53 np0005592767 neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096[220549]: [WARNING]  (220553) : All workers exited. Exiting... (0)
Jan 22 17:27:53 np0005592767 systemd[1]: libpod-e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16.scope: Deactivated successfully.
Jan 22 17:27:53 np0005592767 podman[220611]: 2026-01-22 22:27:53.058597758 +0000 UTC m=+0.045292334 container died e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:27:53 np0005592767 NetworkManager[54973]: <info>  [1769120873.0682] manager: (tap7902cbe2-97): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.070 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:53 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16-userdata-shm.mount: Deactivated successfully.
Jan 22 17:27:53 np0005592767 systemd[1]: var-lib-containers-storage-overlay-83c8dc32b724e76b5efbedd12a641d217a93908d60fadfa574b2086753e19313-merged.mount: Deactivated successfully.
Jan 22 17:27:53 np0005592767 podman[220611]: 2026-01-22 22:27:53.092590562 +0000 UTC m=+0.079285148 container cleanup e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.106 182627 INFO nova.virt.libvirt.driver [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Instance destroyed successfully.#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.107 182627 DEBUG nova.objects.instance [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lazy-loading 'resources' on Instance uuid 171a813c-4b1b-4775-9118-bb981fe6e552 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:27:53 np0005592767 systemd[1]: libpod-conmon-e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16.scope: Deactivated successfully.
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.120 182627 DEBUG nova.virt.libvirt.vif [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:27:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-175198653',display_name='tempest-ImagesOneServerNegativeTestJSON-server-175198653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-175198653',id=68,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:27:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fddbe2a6da13452699c5bbc558443813',ramdisk_id='',reservation_id='r-t0dbvjd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1015176105',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1015176105-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:27:51Z,user_data=None,user_id='fba86c7064d54b7e8f99276901e384ad',uuid=171a813c-4b1b-4775-9118-bb981fe6e552,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.120 182627 DEBUG nova.network.os_vif_util [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Converting VIF {"id": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "address": "fa:16:3e:09:f4:f5", "network": {"id": "fe4b5322-4c9b-44ee-837c-ac7fe98d7096", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1180030365-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fddbe2a6da13452699c5bbc558443813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7902cbe2-97", "ovs_interfaceid": "7902cbe2-977d-40ee-9fe2-3fada95d919d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.121 182627 DEBUG nova.network.os_vif_util [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.121 182627 DEBUG os_vif [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.123 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7902cbe2-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.124 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.127 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.129 182627 INFO os_vif [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:f4:f5,bridge_name='br-int',has_traffic_filtering=True,id=7902cbe2-977d-40ee-9fe2-3fada95d919d,network=Network(fe4b5322-4c9b-44ee-837c-ac7fe98d7096),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7902cbe2-97')#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.130 182627 INFO nova.virt.libvirt.driver [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Deleting instance files /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552_del#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.131 182627 INFO nova.virt.libvirt.driver [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Deletion of /var/lib/nova/instances/171a813c-4b1b-4775-9118-bb981fe6e552_del complete#033[00m
Jan 22 17:27:53 np0005592767 podman[220654]: 2026-01-22 22:27:53.167310379 +0000 UTC m=+0.046834548 container remove e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.172 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a9522d-0da8-419b-8b2c-4994a34d4867]: (4, ('Thu Jan 22 10:27:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096 (e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16)\ne0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16\nThu Jan 22 10:27:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096 (e0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16)\ne0ffa21a7f079bee34960e4513b661f33371d6992b1620a19cd2b54f25300b16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.173 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[686ff8ff-2904-4a59-bfc1-c0441c77eb34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.174 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe4b5322-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.176 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:53 np0005592767 kernel: tapfe4b5322-40: left promiscuous mode
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.187 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.190 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[440692ff-858d-4548-97da-9dc19a2158a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.196 182627 INFO nova.compute.manager [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.197 182627 DEBUG oslo.service.loopingcall [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.197 182627 DEBUG nova.compute.manager [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.197 182627 DEBUG nova.network.neutron [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.202 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b713bb-03c9-482f-b03d-096e3d2e5caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.204 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f53ae8d6-cc2c-4ff1-8a5d-d705d72f7f8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.208 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120858.2058935, 12c7660a-27b8-417e-be1f-cccf937421a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.208 182627 INFO nova.compute.manager [-] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.221 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[14bb6ba7-159b-4dc0-8602-6d353e996fed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446745, 'reachable_time': 26440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220669, 'error': None, 'target': 'ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 systemd[1]: run-netns-ovnmeta\x2dfe4b5322\x2d4c9b\x2d44ee\x2d837c\x2dac7fe98d7096.mount: Deactivated successfully.
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.224 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe4b5322-4c9b-44ee-837c-ac7fe98d7096 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:27:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:27:53.225 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[8617c140-a6cc-4524-a991-cfd3424a0f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.229 182627 DEBUG nova.compute.manager [None req-8be77a91-7d63-4ce8-994c-a0644d1306e1 - - - - - -] [instance: 12c7660a-27b8-417e-be1f-cccf937421a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.813 182627 DEBUG nova.compute.manager [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.813 182627 DEBUG oslo_concurrency.lockutils [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.813 182627 DEBUG oslo_concurrency.lockutils [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.814 182627 DEBUG oslo_concurrency.lockutils [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.814 182627 DEBUG nova.compute.manager [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] No waiting events found dispatching network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.814 182627 WARNING nova.compute.manager [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received unexpected event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.814 182627 DEBUG nova.compute.manager [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-vif-unplugged-7902cbe2-977d-40ee-9fe2-3fada95d919d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.814 182627 DEBUG oslo_concurrency.lockutils [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.814 182627 DEBUG oslo_concurrency.lockutils [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.815 182627 DEBUG oslo_concurrency.lockutils [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.815 182627 DEBUG nova.compute.manager [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] No waiting events found dispatching network-vif-unplugged-7902cbe2-977d-40ee-9fe2-3fada95d919d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:53 np0005592767 nova_compute[182623]: 2026-01-22 22:27:53.815 182627 DEBUG nova.compute.manager [req-161c3e8d-ed31-4a9b-97f2-1ccf901c41ad req-b4b7e03b-fdc4-46b5-8f73-c76a2d38ff60 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-vif-unplugged-7902cbe2-977d-40ee-9fe2-3fada95d919d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.018 182627 DEBUG nova.network.neutron [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.035 182627 INFO nova.compute.manager [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Took 0.84 seconds to deallocate network for instance.#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.093 182627 DEBUG nova.compute.manager [req-d5288841-5434-4640-b010-13cf676cc9d7 req-b92b5579-f5cc-4d83-8c40-f1ab48318023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-vif-deleted-7902cbe2-977d-40ee-9fe2-3fada95d919d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.124 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.124 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.184 182627 DEBUG nova.compute.provider_tree [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.201 182627 DEBUG nova.scheduler.client.report [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.222 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.245 182627 INFO nova.scheduler.client.report [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Deleted allocations for instance 171a813c-4b1b-4775-9118-bb981fe6e552#033[00m
Jan 22 17:27:54 np0005592767 nova_compute[182623]: 2026-01-22 22:27:54.339 182627 DEBUG oslo_concurrency.lockutils [None req-66604393-fc77-42f2-b06f-126c2731c827 fba86c7064d54b7e8f99276901e384ad fddbe2a6da13452699c5bbc558443813 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.606 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.931 182627 DEBUG nova.compute.manager [req-1863b15b-0de1-4efe-837a-87e9b0a5b03f req-d462b792-d529-4af9-882b-f79821194386 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.931 182627 DEBUG oslo_concurrency.lockutils [req-1863b15b-0de1-4efe-837a-87e9b0a5b03f req-d462b792-d529-4af9-882b-f79821194386 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.932 182627 DEBUG oslo_concurrency.lockutils [req-1863b15b-0de1-4efe-837a-87e9b0a5b03f req-d462b792-d529-4af9-882b-f79821194386 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.932 182627 DEBUG oslo_concurrency.lockutils [req-1863b15b-0de1-4efe-837a-87e9b0a5b03f req-d462b792-d529-4af9-882b-f79821194386 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "171a813c-4b1b-4775-9118-bb981fe6e552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.932 182627 DEBUG nova.compute.manager [req-1863b15b-0de1-4efe-837a-87e9b0a5b03f req-d462b792-d529-4af9-882b-f79821194386 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] No waiting events found dispatching network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:27:55 np0005592767 nova_compute[182623]: 2026-01-22 22:27:55.932 182627 WARNING nova.compute.manager [req-1863b15b-0de1-4efe-837a-87e9b0a5b03f req-d462b792-d529-4af9-882b-f79821194386 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Received unexpected event network-vif-plugged-7902cbe2-977d-40ee-9fe2-3fada95d919d for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:27:58 np0005592767 nova_compute[182623]: 2026-01-22 22:27:58.126 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:00 np0005592767 nova_compute[182623]: 2026-01-22 22:28:00.608 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:01 np0005592767 nova_compute[182623]: 2026-01-22 22:28:01.508 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:03 np0005592767 nova_compute[182623]: 2026-01-22 22:28:03.129 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:05 np0005592767 nova_compute[182623]: 2026-01-22 22:28:05.611 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:06 np0005592767 podman[220670]: 2026-01-22 22:28:06.158413889 +0000 UTC m=+0.073474483 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:28:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:28:08 np0005592767 nova_compute[182623]: 2026-01-22 22:28:08.106 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120873.1046152, 171a813c-4b1b-4775-9118-bb981fe6e552 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:28:08 np0005592767 nova_compute[182623]: 2026-01-22 22:28:08.106 182627 INFO nova.compute.manager [-] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:28:08 np0005592767 nova_compute[182623]: 2026-01-22 22:28:08.133 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:08 np0005592767 nova_compute[182623]: 2026-01-22 22:28:08.590 182627 DEBUG nova.compute.manager [None req-7458c53c-ce45-44ae-ad06-f09ff9017979 - - - - - -] [instance: 171a813c-4b1b-4775-9118-bb981fe6e552] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:28:10 np0005592767 nova_compute[182623]: 2026-01-22 22:28:10.612 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:11 np0005592767 podman[220690]: 2026-01-22 22:28:11.166774608 +0000 UTC m=+0.076084817 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Jan 22 17:28:11 np0005592767 podman[220689]: 2026-01-22 22:28:11.229066093 +0000 UTC m=+0.146978426 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:28:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:12.099 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:12.100 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:12.100 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:13 np0005592767 nova_compute[182623]: 2026-01-22 22:28:13.136 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:13 np0005592767 nova_compute[182623]: 2026-01-22 22:28:13.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:14 np0005592767 nova_compute[182623]: 2026-01-22 22:28:14.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.657 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.913 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.913 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.945 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.946 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.946 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:15 np0005592767 nova_compute[182623]: 2026-01-22 22:28:15.946 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.091 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.092 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5737MB free_disk=73.23602676391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.092 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.092 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.151 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.151 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.198 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.229 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.256 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:28:16 np0005592767 nova_compute[182623]: 2026-01-22 22:28:16.256 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:17 np0005592767 podman[220737]: 2026-01-22 22:28:17.140980602 +0000 UTC m=+0.047080125 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:28:17 np0005592767 podman[220738]: 2026-01-22 22:28:17.165156667 +0000 UTC m=+0.075126780 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:28:18 np0005592767 nova_compute[182623]: 2026-01-22 22:28:18.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:20 np0005592767 nova_compute[182623]: 2026-01-22 22:28:20.241 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:28:20 np0005592767 nova_compute[182623]: 2026-01-22 22:28:20.659 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:23 np0005592767 nova_compute[182623]: 2026-01-22 22:28:23.141 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:23 np0005592767 podman[220779]: 2026-01-22 22:28:23.157069183 +0000 UTC m=+0.068411930 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:28:25 np0005592767 nova_compute[182623]: 2026-01-22 22:28:25.661 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:27 np0005592767 nova_compute[182623]: 2026-01-22 22:28:27.983 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:27 np0005592767 nova_compute[182623]: 2026-01-22 22:28:27.983 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.019 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.146 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.190 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.191 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.196 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.196 182627 INFO nova.compute.claims [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.299 182627 DEBUG nova.compute.provider_tree [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.316 182627 DEBUG nova.scheduler.client.report [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.349 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.350 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.412 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.412 182627 DEBUG nova.network.neutron [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.428 182627 INFO nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.445 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.546 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.547 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.548 182627 INFO nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Creating image(s)
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.548 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "/var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.549 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "/var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.549 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "/var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.561 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.619 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.620 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.621 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.631 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.685 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.686 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.717 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.718 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.719 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.778 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.779 182627 DEBUG nova.virt.disk.api [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Checking if we can resize image /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.779 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.811 182627 DEBUG nova.policy [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '932ce666462f4983b308d3e827a26e5d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4f57a04bbc3f411089c62dae2e7c730b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.832 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.834 182627 DEBUG nova.virt.disk.api [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Cannot resize image /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.834 182627 DEBUG nova.objects.instance [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lazy-loading 'migration_context' on Instance uuid 583cf74b-0bbd-4315-984c-5810efd4dede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.849 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.850 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Ensure instance console log exists: /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.851 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.852 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:28:28 np0005592767 nova_compute[182623]: 2026-01-22 22:28:28.852 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:28:29 np0005592767 nova_compute[182623]: 2026-01-22 22:28:29.881 182627 DEBUG nova.network.neutron [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Successfully created port: 396dcf22-5beb-4af8-9284-a57b6d018972 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:28:30 np0005592767 nova_compute[182623]: 2026-01-22 22:28:30.705 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.518 182627 DEBUG nova.network.neutron [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Successfully updated port: 396dcf22-5beb-4af8-9284-a57b6d018972 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.536 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.537 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquired lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.537 182627 DEBUG nova.network.neutron [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.615 182627 DEBUG nova.compute.manager [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-changed-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.615 182627 DEBUG nova.compute.manager [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Refreshing instance network info cache due to event network-changed-396dcf22-5beb-4af8-9284-a57b6d018972. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.615 182627 DEBUG oslo_concurrency.lockutils [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:28:31 np0005592767 nova_compute[182623]: 2026-01-22 22:28:31.717 182627 DEBUG nova.network.neutron [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.696 182627 DEBUG nova.network.neutron [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Updating instance_info_cache with network_info: [{"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.712 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Releasing lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.712 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Instance network_info: |[{"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.712 182627 DEBUG oslo_concurrency.lockutils [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.713 182627 DEBUG nova.network.neutron [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Refreshing network info cache for port 396dcf22-5beb-4af8-9284-a57b6d018972 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.716 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Start _get_guest_xml network_info=[{"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.721 182627 WARNING nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.729 182627 DEBUG nova.virt.libvirt.host [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.730 182627 DEBUG nova.virt.libvirt.host [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.733 182627 DEBUG nova.virt.libvirt.host [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.734 182627 DEBUG nova.virt.libvirt.host [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.735 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.736 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.736 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.736 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.737 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.737 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.737 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.738 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.738 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.738 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.738 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.739 182627 DEBUG nova.virt.hardware [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.743 182627 DEBUG nova.virt.libvirt.vif [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1162781629',display_name='tempest-ServersTestManualDisk-server-1162781629',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1162781629',id=71,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDPJr46yQtMIJOAEpYJTcfiYxRqLWYaj+YeFhoSUcQ7LzXflT1kGCuSOOKCcjS9bSilBCzR8aoPwyMukKukSATIB0qHeXr7dsjEvO1JmzPkF2qAkZJpiBIoTDi2bUiVrCQ==',key_name='tempest-keypair-1174644721',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f57a04bbc3f411089c62dae2e7c730b',ramdisk_id='',reservation_id='r-o0dipdfn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-590483923',owner_user_name='tempest-ServersTestManualDisk-590483923-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:28:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='932ce666462f4983b308d3e827a26e5d',uuid=583cf74b-0bbd-4315-984c-5810efd4dede,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.743 182627 DEBUG nova.network.os_vif_util [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Converting VIF {"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.744 182627 DEBUG nova.network.os_vif_util [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.745 182627 DEBUG nova.objects.instance [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lazy-loading 'pci_devices' on Instance uuid 583cf74b-0bbd-4315-984c-5810efd4dede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.760 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <uuid>583cf74b-0bbd-4315-984c-5810efd4dede</uuid>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <name>instance-00000047</name>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersTestManualDisk-server-1162781629</nova:name>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:28:32</nova:creationTime>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:user uuid="932ce666462f4983b308d3e827a26e5d">tempest-ServersTestManualDisk-590483923-project-member</nova:user>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:project uuid="4f57a04bbc3f411089c62dae2e7c730b">tempest-ServersTestManualDisk-590483923</nova:project>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        <nova:port uuid="396dcf22-5beb-4af8-9284-a57b6d018972">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <entry name="serial">583cf74b-0bbd-4315-984c-5810efd4dede</entry>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <entry name="uuid">583cf74b-0bbd-4315-984c-5810efd4dede</entry>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.config"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:16:d5:2d"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <target dev="tap396dcf22-5b"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/console.log" append="off"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:28:32 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:28:32 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:28:32 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:28:32 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.761 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Preparing to wait for external event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.762 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.763 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.764 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.765 182627 DEBUG nova.virt.libvirt.vif [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1162781629',display_name='tempest-ServersTestManualDisk-server-1162781629',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1162781629',id=71,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDPJr46yQtMIJOAEpYJTcfiYxRqLWYaj+YeFhoSUcQ7LzXflT1kGCuSOOKCcjS9bSilBCzR8aoPwyMukKukSATIB0qHeXr7dsjEvO1JmzPkF2qAkZJpiBIoTDi2bUiVrCQ==',key_name='tempest-keypair-1174644721',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4f57a04bbc3f411089c62dae2e7c730b',ramdisk_id='',reservation_id='r-o0dipdfn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-590483923',owner_user_name='tempest-ServersTestManualDisk-590483923-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:28:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='932ce666462f4983b308d3e827a26e5d',uuid=583cf74b-0bbd-4315-984c-5810efd4dede,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.766 182627 DEBUG nova.network.os_vif_util [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Converting VIF {"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.768 182627 DEBUG nova.network.os_vif_util [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.769 182627 DEBUG os_vif [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.771 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.772 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.776 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.777 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap396dcf22-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.778 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap396dcf22-5b, col_values=(('external_ids', {'iface-id': '396dcf22-5beb-4af8-9284-a57b6d018972', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:d5:2d', 'vm-uuid': '583cf74b-0bbd-4315-984c-5810efd4dede'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:32 np0005592767 NetworkManager[54973]: <info>  [1769120912.7813] manager: (tap396dcf22-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.785 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.786 182627 INFO os_vif [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b')#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.839 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.840 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.840 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] No VIF found with MAC fa:16:3e:16:d5:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:28:32 np0005592767 nova_compute[182623]: 2026-01-22 22:28:32.840 182627 INFO nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Using config drive#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.241 182627 INFO nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Creating config drive at /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.config#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.253 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5b5eaumn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.396 182627 DEBUG oslo_concurrency.processutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5b5eaumn" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:28:33 np0005592767 kernel: tap396dcf22-5b: entered promiscuous mode
Jan 22 17:28:33 np0005592767 NetworkManager[54973]: <info>  [1769120913.4612] manager: (tap396dcf22-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 22 17:28:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:33Z|00243|binding|INFO|Claiming lport 396dcf22-5beb-4af8-9284-a57b6d018972 for this chassis.
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.462 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:33Z|00244|binding|INFO|396dcf22-5beb-4af8-9284-a57b6d018972: Claiming fa:16:3e:16:d5:2d 10.100.0.5
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.465 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.479 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:d5:2d 10.100.0.5'], port_security=['fa:16:3e:16:d5:2d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '583cf74b-0bbd-4315-984c-5810efd4dede', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de3069b8-f4ea-4784-8116-541810c129d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f57a04bbc3f411089c62dae2e7c730b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '586c23e2-41e0-498b-9d9e-0ccd90bd8ec8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85c42d0c-8375-4910-bc6c-d5b4e87f389b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=396dcf22-5beb-4af8-9284-a57b6d018972) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.480 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 396dcf22-5beb-4af8-9284-a57b6d018972 in datapath de3069b8-f4ea-4784-8116-541810c129d6 bound to our chassis#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.482 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de3069b8-f4ea-4784-8116-541810c129d6#033[00m
Jan 22 17:28:33 np0005592767 systemd-udevd[220837]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.492 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[58469dd4-8798-4dbf-9042-57e59e8b48c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.493 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapde3069b8-f1 in ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.495 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapde3069b8-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.496 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e1bdf3-5b6f-4385-9cc7-83414ef8b0ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.497 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2cb97d-b34d-4e21-91c3-797e2a103322]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 systemd-machined[153912]: New machine qemu-34-instance-00000047.
Jan 22 17:28:33 np0005592767 NetworkManager[54973]: <info>  [1769120913.5007] device (tap396dcf22-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:28:33 np0005592767 NetworkManager[54973]: <info>  [1769120913.5013] device (tap396dcf22-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.507 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[210e3609-ab09-4934-8635-36608c90f6ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.518 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:33Z|00245|binding|INFO|Setting lport 396dcf22-5beb-4af8-9284-a57b6d018972 ovn-installed in OVS
Jan 22 17:28:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:33Z|00246|binding|INFO|Setting lport 396dcf22-5beb-4af8-9284-a57b6d018972 up in Southbound
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.525 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.530 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[19c30b3e-db60-49df-b9c0-74f4dc6790a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 systemd[1]: Started Virtual Machine qemu-34-instance-00000047.
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.559 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[37dbfeff-e278-4ce8-a42b-8a694f022adf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.564 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a47615-4d4b-4873-95fc-77a2629ca357]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 NetworkManager[54973]: <info>  [1769120913.5660] manager: (tapde3069b8-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.591 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0c8367-9f7c-412e-b7a3-4195ff97737e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.594 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7233ada0-7d3f-413d-a7c4-b9a5656507e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 NetworkManager[54973]: <info>  [1769120913.6148] device (tapde3069b8-f0): carrier: link connected
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.620 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[42e985c7-d34e-41e4-9e2a-f80733f79a92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.636 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0e252ce1-10cc-47ed-b7ca-1b04e0dca969]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde3069b8-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:00:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451019, 'reachable_time': 27918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220871, 'error': None, 'target': 'ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.651 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5b649e-6a6f-4aeb-89f8-6fa333447a44]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea6:f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451019, 'tstamp': 451019}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220872, 'error': None, 'target': 'ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.669 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[725419c9-bae2-4a97-b1f9-a0b80a420f40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde3069b8-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:00:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451019, 'reachable_time': 27918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220873, 'error': None, 'target': 'ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.695 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0aec7bd4-efc6-4958-832b-e018b2570be0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.751 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[52af5943-2f78-4b54-85d4-27170040204f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.752 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde3069b8-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.753 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.753 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde3069b8-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:33 np0005592767 kernel: tapde3069b8-f0: entered promiscuous mode
Jan 22 17:28:33 np0005592767 NetworkManager[54973]: <info>  [1769120913.7559] manager: (tapde3069b8-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.755 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.760 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde3069b8-f0, col_values=(('external_ids', {'iface-id': 'ff931cac-379d-4e70-ae40-750b417b69d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.758 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:33Z|00247|binding|INFO|Releasing lport ff931cac-379d-4e70-ae40-750b417b69d2 from this chassis (sb_readonly=0)
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.765 182627 DEBUG nova.compute.manager [req-dc1ed591-6def-4254-832b-0cb1e05fb8a9 req-e131edaf-a6b8-48f5-8399-9923c566f977 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.766 182627 DEBUG oslo_concurrency.lockutils [req-dc1ed591-6def-4254-832b-0cb1e05fb8a9 req-e131edaf-a6b8-48f5-8399-9923c566f977 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.766 182627 DEBUG oslo_concurrency.lockutils [req-dc1ed591-6def-4254-832b-0cb1e05fb8a9 req-e131edaf-a6b8-48f5-8399-9923c566f977 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.766 182627 DEBUG oslo_concurrency.lockutils [req-dc1ed591-6def-4254-832b-0cb1e05fb8a9 req-e131edaf-a6b8-48f5-8399-9923c566f977 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.767 182627 DEBUG nova.compute.manager [req-dc1ed591-6def-4254-832b-0cb1e05fb8a9 req-e131edaf-a6b8-48f5-8399-9923c566f977 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Processing event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 nova_compute[182623]: 2026-01-22 22:28:33.781 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.782 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/de3069b8-f4ea-4784-8116-541810c129d6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/de3069b8-f4ea-4784-8116-541810c129d6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.783 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e25e3f4f-72c9-48b4-ada5-2c83fedb6dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.784 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-de3069b8-f4ea-4784-8116-541810c129d6
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/de3069b8-f4ea-4784-8116-541810c129d6.pid.haproxy
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID de3069b8-f4ea-4784-8116-541810c129d6
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:28:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:33.784 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6', 'env', 'PROCESS_TAG=haproxy-de3069b8-f4ea-4784-8116-541810c129d6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/de3069b8-f4ea-4784-8116-541810c129d6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:28:34 np0005592767 podman[220908]: 2026-01-22 22:28:34.116122108 +0000 UTC m=+0.045456639 container create 979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:28:34 np0005592767 systemd[1]: Started libpod-conmon-979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc.scope.
Jan 22 17:28:34 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:28:34 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12da75f8f3414191bd3034fb2bf5b9a240f8d2e1eb9d71ce34daf346365ca051/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:28:34 np0005592767 podman[220908]: 2026-01-22 22:28:34.092742856 +0000 UTC m=+0.022077417 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.193 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.194 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120914.192915, 583cf74b-0bbd-4315-984c-5810efd4dede => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.195 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] VM Started (Lifecycle Event)#033[00m
Jan 22 17:28:34 np0005592767 podman[220908]: 2026-01-22 22:28:34.199220503 +0000 UTC m=+0.128555034 container init 979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.199 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.203 182627 INFO nova.virt.libvirt.driver [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Instance spawned successfully.#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.204 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:28:34 np0005592767 podman[220908]: 2026-01-22 22:28:34.205320145 +0000 UTC m=+0.134654676 container start 979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.226 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:28:34 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [NOTICE]   (220933) : New worker (220935) forked
Jan 22 17:28:34 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [NOTICE]   (220933) : Loading success.
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.235 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.239 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.240 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.240 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.241 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.242 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.242 182627 DEBUG nova.virt.libvirt.driver [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.259 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.260 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120914.1930702, 583cf74b-0bbd-4315-984c-5810efd4dede => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.260 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.423 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.428 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120914.1982985, 583cf74b-0bbd-4315-984c-5810efd4dede => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.428 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.457 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.460 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.466 182627 INFO nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Took 5.92 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.466 182627 DEBUG nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.476 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.544 182627 INFO nova.compute.manager [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Took 6.40 seconds to build instance.#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.574 182627 DEBUG nova.network.neutron [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Updated VIF entry in instance network info cache for port 396dcf22-5beb-4af8-9284-a57b6d018972. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.574 182627 DEBUG nova.network.neutron [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Updating instance_info_cache with network_info: [{"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.584 182627 DEBUG oslo_concurrency.lockutils [None req-db3ff702-07db-467b-8a94-9bbe13fbe8ce 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:34 np0005592767 nova_compute[182623]: 2026-01-22 22:28:34.592 182627 DEBUG oslo_concurrency.lockutils [req-f7d5f733-1889-429c-8d61-f21fac5e3066 req-4dddb09f-c581-47ca-b96d-8fc414d9f82f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.706 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.857 182627 DEBUG nova.compute.manager [req-08ee0f09-5187-47e3-bdd8-867c0e76c218 req-382c40ce-5aa9-4e2f-b65d-3a60f36dde3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.858 182627 DEBUG oslo_concurrency.lockutils [req-08ee0f09-5187-47e3-bdd8-867c0e76c218 req-382c40ce-5aa9-4e2f-b65d-3a60f36dde3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.858 182627 DEBUG oslo_concurrency.lockutils [req-08ee0f09-5187-47e3-bdd8-867c0e76c218 req-382c40ce-5aa9-4e2f-b65d-3a60f36dde3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.859 182627 DEBUG oslo_concurrency.lockutils [req-08ee0f09-5187-47e3-bdd8-867c0e76c218 req-382c40ce-5aa9-4e2f-b65d-3a60f36dde3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.859 182627 DEBUG nova.compute.manager [req-08ee0f09-5187-47e3-bdd8-867c0e76c218 req-382c40ce-5aa9-4e2f-b65d-3a60f36dde3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] No waiting events found dispatching network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:28:35 np0005592767 nova_compute[182623]: 2026-01-22 22:28:35.860 182627 WARNING nova.compute.manager [req-08ee0f09-5187-47e3-bdd8-867c0e76c218 req-382c40ce-5aa9-4e2f-b65d-3a60f36dde3e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received unexpected event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:28:37 np0005592767 podman[220944]: 2026-01-22 22:28:37.168419702 +0000 UTC m=+0.077573339 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.334 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:37 np0005592767 NetworkManager[54973]: <info>  [1769120917.3361] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 22 17:28:37 np0005592767 NetworkManager[54973]: <info>  [1769120917.3378] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.487 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:37Z|00248|binding|INFO|Releasing lport ff931cac-379d-4e70-ae40-750b417b69d2 from this chassis (sb_readonly=0)
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.514 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.532 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.651 182627 DEBUG nova.compute.manager [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-changed-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.652 182627 DEBUG nova.compute.manager [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Refreshing instance network info cache due to event network-changed-396dcf22-5beb-4af8-9284-a57b6d018972. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.652 182627 DEBUG oslo_concurrency.lockutils [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.653 182627 DEBUG oslo_concurrency.lockutils [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.653 182627 DEBUG nova.network.neutron [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Refreshing network info cache for port 396dcf22-5beb-4af8-9284-a57b6d018972 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:28:37 np0005592767 nova_compute[182623]: 2026-01-22 22:28:37.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:39 np0005592767 nova_compute[182623]: 2026-01-22 22:28:39.113 182627 DEBUG nova.network.neutron [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Updated VIF entry in instance network info cache for port 396dcf22-5beb-4af8-9284-a57b6d018972. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:28:39 np0005592767 nova_compute[182623]: 2026-01-22 22:28:39.114 182627 DEBUG nova.network.neutron [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Updating instance_info_cache with network_info: [{"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:28:39 np0005592767 nova_compute[182623]: 2026-01-22 22:28:39.134 182627 DEBUG oslo_concurrency.lockutils [req-49256584-27f1-4994-b25e-8b053bf3db44 req-e52e70d2-1363-4e53-a666-02457561cf3f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-583cf74b-0bbd-4315-984c-5810efd4dede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:28:40 np0005592767 nova_compute[182623]: 2026-01-22 22:28:40.707 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:40Z|00249|binding|INFO|Releasing lport ff931cac-379d-4e70-ae40-750b417b69d2 from this chassis (sb_readonly=0)
Jan 22 17:28:41 np0005592767 nova_compute[182623]: 2026-01-22 22:28:41.068 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:41 np0005592767 nova_compute[182623]: 2026-01-22 22:28:41.873 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:42 np0005592767 podman[220963]: 2026-01-22 22:28:42.13937248 +0000 UTC m=+0.057417988 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Jan 22 17:28:42 np0005592767 podman[220962]: 2026-01-22 22:28:42.173045414 +0000 UTC m=+0.084688481 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:28:42 np0005592767 nova_compute[182623]: 2026-01-22 22:28:42.818 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:44 np0005592767 nova_compute[182623]: 2026-01-22 22:28:44.840 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:45Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:16:d5:2d 10.100.0.5
Jan 22 17:28:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:45Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:16:d5:2d 10.100.0.5
Jan 22 17:28:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:45Z|00250|binding|INFO|Releasing lport ff931cac-379d-4e70-ae40-750b417b69d2 from this chassis (sb_readonly=0)
Jan 22 17:28:45 np0005592767 nova_compute[182623]: 2026-01-22 22:28:45.715 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:45 np0005592767 nova_compute[182623]: 2026-01-22 22:28:45.719 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:47.526 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:28:47 np0005592767 nova_compute[182623]: 2026-01-22 22:28:47.527 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:47.529 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:28:47 np0005592767 nova_compute[182623]: 2026-01-22 22:28:47.820 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:48 np0005592767 podman[221020]: 2026-01-22 22:28:48.150367257 +0000 UTC m=+0.056470681 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:28:48 np0005592767 podman[221019]: 2026-01-22 22:28:48.170454486 +0000 UTC m=+0.076443417 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:28:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:50.532 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:50 np0005592767 nova_compute[182623]: 2026-01-22 22:28:50.720 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:51Z|00251|binding|INFO|Releasing lport ff931cac-379d-4e70-ae40-750b417b69d2 from this chassis (sb_readonly=0)
Jan 22 17:28:51 np0005592767 nova_compute[182623]: 2026-01-22 22:28:51.405 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:52 np0005592767 nova_compute[182623]: 2026-01-22 22:28:52.823 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:54 np0005592767 podman[221062]: 2026-01-22 22:28:54.126558018 +0000 UTC m=+0.049486773 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.666 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.666 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.666 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.667 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.667 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.680 182627 INFO nova.compute.manager [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Terminating instance#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.690 182627 DEBUG nova.compute.manager [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:28:55 np0005592767 kernel: tap396dcf22-5b (unregistering): left promiscuous mode
Jan 22 17:28:55 np0005592767 NetworkManager[54973]: <info>  [1769120935.7216] device (tap396dcf22-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:28:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:55Z|00252|binding|INFO|Releasing lport 396dcf22-5beb-4af8-9284-a57b6d018972 from this chassis (sb_readonly=0)
Jan 22 17:28:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:55Z|00253|binding|INFO|Setting lport 396dcf22-5beb-4af8-9284-a57b6d018972 down in Southbound
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.735 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:28:55Z|00254|binding|INFO|Removing iface tap396dcf22-5b ovn-installed in OVS
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.738 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:55.743 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:d5:2d 10.100.0.5'], port_security=['fa:16:3e:16:d5:2d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '583cf74b-0bbd-4315-984c-5810efd4dede', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de3069b8-f4ea-4784-8116-541810c129d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4f57a04bbc3f411089c62dae2e7c730b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '586c23e2-41e0-498b-9d9e-0ccd90bd8ec8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85c42d0c-8375-4910-bc6c-d5b4e87f389b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=396dcf22-5beb-4af8-9284-a57b6d018972) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:28:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:55.745 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 396dcf22-5beb-4af8-9284-a57b6d018972 in datapath de3069b8-f4ea-4784-8116-541810c129d6 unbound from our chassis#033[00m
Jan 22 17:28:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:55.747 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network de3069b8-f4ea-4784-8116-541810c129d6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:28:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:55.751 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[20d1ec60-2829-4540-90a2-7059a2fcde59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:55.752 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6 namespace which is not needed anymore#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.764 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000047.scope: Deactivated successfully.
Jan 22 17:28:55 np0005592767 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000047.scope: Consumed 12.633s CPU time.
Jan 22 17:28:55 np0005592767 systemd-machined[153912]: Machine qemu-34-instance-00000047 terminated.
Jan 22 17:28:55 np0005592767 NetworkManager[54973]: <info>  [1769120935.9107] manager: (tap396dcf22-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Jan 22 17:28:55 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [NOTICE]   (220933) : haproxy version is 2.8.14-c23fe91
Jan 22 17:28:55 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [NOTICE]   (220933) : path to executable is /usr/sbin/haproxy
Jan 22 17:28:55 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [WARNING]  (220933) : Exiting Master process...
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.912 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [ALERT]    (220933) : Current worker (220935) exited with code 143 (Terminated)
Jan 22 17:28:55 np0005592767 neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6[220929]: [WARNING]  (220933) : All workers exited. Exiting... (0)
Jan 22 17:28:55 np0005592767 systemd[1]: libpod-979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc.scope: Deactivated successfully.
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.919 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 podman[221111]: 2026-01-22 22:28:55.923992706 +0000 UTC m=+0.053723863 container died 979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.955 182627 INFO nova.virt.libvirt.driver [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Instance destroyed successfully.#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.956 182627 DEBUG nova.objects.instance [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lazy-loading 'resources' on Instance uuid 583cf74b-0bbd-4315-984c-5810efd4dede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:28:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc-userdata-shm.mount: Deactivated successfully.
Jan 22 17:28:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay-12da75f8f3414191bd3034fb2bf5b9a240f8d2e1eb9d71ce34daf346365ca051-merged.mount: Deactivated successfully.
Jan 22 17:28:55 np0005592767 podman[221111]: 2026-01-22 22:28:55.964414642 +0000 UTC m=+0.094145789 container cleanup 979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.970 182627 DEBUG nova.virt.libvirt.vif [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1162781629',display_name='tempest-ServersTestManualDisk-server-1162781629',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1162781629',id=71,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDPJr46yQtMIJOAEpYJTcfiYxRqLWYaj+YeFhoSUcQ7LzXflT1kGCuSOOKCcjS9bSilBCzR8aoPwyMukKukSATIB0qHeXr7dsjEvO1JmzPkF2qAkZJpiBIoTDi2bUiVrCQ==',key_name='tempest-keypair-1174644721',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:28:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4f57a04bbc3f411089c62dae2e7c730b',ramdisk_id='',reservation_id='r-o0dipdfn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-590483923',owner_user_name='tempest-ServersTestManualDisk-590483923-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:28:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='932ce666462f4983b308d3e827a26e5d',uuid=583cf74b-0bbd-4315-984c-5810efd4dede,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.971 182627 DEBUG nova.network.os_vif_util [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Converting VIF {"id": "396dcf22-5beb-4af8-9284-a57b6d018972", "address": "fa:16:3e:16:d5:2d", "network": {"id": "de3069b8-f4ea-4784-8116-541810c129d6", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-364913651-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4f57a04bbc3f411089c62dae2e7c730b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap396dcf22-5b", "ovs_interfaceid": "396dcf22-5beb-4af8-9284-a57b6d018972", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.971 182627 DEBUG nova.network.os_vif_util [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.972 182627 DEBUG os_vif [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.974 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.974 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap396dcf22-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.977 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.978 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:28:55 np0005592767 systemd[1]: libpod-conmon-979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc.scope: Deactivated successfully.
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.982 182627 INFO os_vif [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:d5:2d,bridge_name='br-int',has_traffic_filtering=True,id=396dcf22-5beb-4af8-9284-a57b6d018972,network=Network(de3069b8-f4ea-4784-8116-541810c129d6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap396dcf22-5b')#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.983 182627 INFO nova.virt.libvirt.driver [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Deleting instance files /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede_del#033[00m
Jan 22 17:28:55 np0005592767 nova_compute[182623]: 2026-01-22 22:28:55.983 182627 INFO nova.virt.libvirt.driver [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Deletion of /var/lib/nova/instances/583cf74b-0bbd-4315-984c-5810efd4dede_del complete#033[00m
Jan 22 17:28:56 np0005592767 podman[221154]: 2026-01-22 22:28:56.030738381 +0000 UTC m=+0.045468780 container remove 979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.035 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e13f2627-222f-4f99-86c1-d6bc92ca508b]: (4, ('Thu Jan 22 10:28:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6 (979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc)\n979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc\nThu Jan 22 10:28:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6 (979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc)\n979df59ec7869ba12800ed0628ef339e099cc9dfc083d5073049fc2d1d22e9bc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.037 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f66590-c5d5-4329-aee9-6a2ed59a98c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.037 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde3069b8-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:56 np0005592767 kernel: tapde3069b8-f0: left promiscuous mode
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.048 182627 INFO nova.compute.manager [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.048 182627 DEBUG oslo.service.loopingcall [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.049 182627 DEBUG nova.compute.manager [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.049 182627 DEBUG nova.network.neutron [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.055 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4adc1914-ce38-4d84-82a8-df2602cd7ffa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.078 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6474e987-ff86-46d4-bef2-c1c7380ddd0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.080 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[26fa55fc-0167-42fd-af03-13b7a189f177]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.097 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a178578b-e53c-494f-a3c0-58fabe7fb31b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451013, 'reachable_time': 26422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221169, 'error': None, 'target': 'ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 systemd[1]: run-netns-ovnmeta\x2dde3069b8\x2df4ea\x2d4784\x2d8116\x2d541810c129d6.mount: Deactivated successfully.
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.102 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-de3069b8-f4ea-4784-8116-541810c129d6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:28:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:28:56.102 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[fed6cf6c-c61e-483d-b7dd-0718c88f61c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.178 182627 DEBUG nova.compute.manager [req-32bd8c3a-3701-4bcb-b4f1-7126112f6105 req-50aa0524-d4d7-4eba-bbf6-59dcc0020e9d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-vif-unplugged-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.178 182627 DEBUG oslo_concurrency.lockutils [req-32bd8c3a-3701-4bcb-b4f1-7126112f6105 req-50aa0524-d4d7-4eba-bbf6-59dcc0020e9d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.178 182627 DEBUG oslo_concurrency.lockutils [req-32bd8c3a-3701-4bcb-b4f1-7126112f6105 req-50aa0524-d4d7-4eba-bbf6-59dcc0020e9d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.179 182627 DEBUG oslo_concurrency.lockutils [req-32bd8c3a-3701-4bcb-b4f1-7126112f6105 req-50aa0524-d4d7-4eba-bbf6-59dcc0020e9d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.179 182627 DEBUG nova.compute.manager [req-32bd8c3a-3701-4bcb-b4f1-7126112f6105 req-50aa0524-d4d7-4eba-bbf6-59dcc0020e9d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] No waiting events found dispatching network-vif-unplugged-396dcf22-5beb-4af8-9284-a57b6d018972 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.179 182627 DEBUG nova.compute.manager [req-32bd8c3a-3701-4bcb-b4f1-7126112f6105 req-50aa0524-d4d7-4eba-bbf6-59dcc0020e9d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-vif-unplugged-396dcf22-5beb-4af8-9284-a57b6d018972 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:28:56 np0005592767 nova_compute[182623]: 2026-01-22 22:28:56.611 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.137 182627 DEBUG nova.network.neutron [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.182 182627 INFO nova.compute.manager [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Took 1.13 seconds to deallocate network for instance.#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.267 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.267 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.333 182627 DEBUG nova.compute.provider_tree [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.380 182627 DEBUG nova.scheduler.client.report [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.408 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.442 182627 INFO nova.scheduler.client.report [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Deleted allocations for instance 583cf74b-0bbd-4315-984c-5810efd4dede#033[00m
Jan 22 17:28:57 np0005592767 nova_compute[182623]: 2026-01-22 22:28:57.534 182627 DEBUG oslo_concurrency.lockutils [None req-f9906eba-b077-4812-a2b8-81ebd9d04299 932ce666462f4983b308d3e827a26e5d 4f57a04bbc3f411089c62dae2e7c730b - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.277 182627 DEBUG nova.compute.manager [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.277 182627 DEBUG oslo_concurrency.lockutils [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.277 182627 DEBUG oslo_concurrency.lockutils [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.278 182627 DEBUG oslo_concurrency.lockutils [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "583cf74b-0bbd-4315-984c-5810efd4dede-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.278 182627 DEBUG nova.compute.manager [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] No waiting events found dispatching network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.278 182627 WARNING nova.compute.manager [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received unexpected event network-vif-plugged-396dcf22-5beb-4af8-9284-a57b6d018972 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:28:58 np0005592767 nova_compute[182623]: 2026-01-22 22:28:58.278 182627 DEBUG nova.compute.manager [req-bf452600-7a41-4f85-bff9-85ae531d8d68 req-549a2c7f-8bbd-4e92-b5c7-88322ddd3804 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Received event network-vif-deleted-396dcf22-5beb-4af8-9284-a57b6d018972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:28:59 np0005592767 nova_compute[182623]: 2026-01-22 22:28:59.725 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:00 np0005592767 nova_compute[182623]: 2026-01-22 22:29:00.405 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:00 np0005592767 nova_compute[182623]: 2026-01-22 22:29:00.766 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:00 np0005592767 nova_compute[182623]: 2026-01-22 22:29:00.976 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:02 np0005592767 nova_compute[182623]: 2026-01-22 22:29:02.797 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:03 np0005592767 nova_compute[182623]: 2026-01-22 22:29:03.295 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:03 np0005592767 nova_compute[182623]: 2026-01-22 22:29:03.481 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.042 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.042 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.062 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.159 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.160 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.167 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.168 182627 INFO nova.compute.claims [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.288 182627 DEBUG nova.compute.provider_tree [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.309 182627 DEBUG nova.scheduler.client.report [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.327 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.327 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.402 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.403 182627 DEBUG nova.network.neutron [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.422 182627 INFO nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.442 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.604 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.607 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.607 182627 INFO nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Creating image(s)#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.609 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.609 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.610 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.639 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.696 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.697 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.698 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.708 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.761 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.762 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.777 182627 DEBUG nova.policy [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.790 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.791 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.792 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.845 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.846 182627 DEBUG nova.virt.disk.api [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.846 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.905 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.906 182627 DEBUG nova.virt.disk.api [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.906 182627 DEBUG nova.objects.instance [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.920 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.921 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Ensure instance console log exists: /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.921 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.922 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:04 np0005592767 nova_compute[182623]: 2026-01-22 22:29:04.922 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:05 np0005592767 nova_compute[182623]: 2026-01-22 22:29:05.769 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:05 np0005592767 nova_compute[182623]: 2026-01-22 22:29:05.927 182627 DEBUG nova.network.neutron [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Successfully created port: 63663dd8-0844-4e65-9378-64416c0b1178 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:29:05 np0005592767 nova_compute[182623]: 2026-01-22 22:29:05.978 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.613 182627 DEBUG nova.network.neutron [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Successfully updated port: 63663dd8-0844-4e65-9378-64416c0b1178 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.645 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.645 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.646 182627 DEBUG nova.network.neutron [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.720 182627 DEBUG nova.compute.manager [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-changed-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.720 182627 DEBUG nova.compute.manager [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Refreshing instance network info cache due to event network-changed-63663dd8-0844-4e65-9378-64416c0b1178. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.721 182627 DEBUG oslo_concurrency.lockutils [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:29:07 np0005592767 nova_compute[182623]: 2026-01-22 22:29:07.822 182627 DEBUG nova.network.neutron [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:29:08 np0005592767 podman[221186]: 2026-01-22 22:29:08.181998418 +0000 UTC m=+0.089923299 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.771 182627 DEBUG nova.network.neutron [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Updating instance_info_cache with network_info: [{"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.789 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.789 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance network_info: |[{"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.790 182627 DEBUG oslo_concurrency.lockutils [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.791 182627 DEBUG nova.network.neutron [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Refreshing network info cache for port 63663dd8-0844-4e65-9378-64416c0b1178 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.796 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Start _get_guest_xml network_info=[{"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.802 182627 WARNING nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.809 182627 DEBUG nova.virt.libvirt.host [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.810 182627 DEBUG nova.virt.libvirt.host [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.814 182627 DEBUG nova.virt.libvirt.host [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.815 182627 DEBUG nova.virt.libvirt.host [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.817 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.818 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.819 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.819 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.820 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.820 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.820 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.821 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.821 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.822 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.822 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.823 182627 DEBUG nova.virt.hardware [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.830 182627 DEBUG nova.virt.libvirt.vif [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1953303378',display_name='tempest-ServerDiskConfigTestJSON-server-1953303378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1953303378',id=73,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-zx4ant69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:04Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=8c343772-6b41-4817-ab66-4bb05c591cc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.830 182627 DEBUG nova.network.os_vif_util [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.832 182627 DEBUG nova.network.os_vif_util [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.834 182627 DEBUG nova.objects.instance [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.851 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <uuid>8c343772-6b41-4817-ab66-4bb05c591cc0</uuid>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <name>instance-00000049</name>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1953303378</nova:name>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:29:08</nova:creationTime>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        <nova:port uuid="63663dd8-0844-4e65-9378-64416c0b1178">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <entry name="serial">8c343772-6b41-4817-ab66-4bb05c591cc0</entry>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <entry name="uuid">8c343772-6b41-4817-ab66-4bb05c591cc0</entry>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:01:b1:02"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <target dev="tap63663dd8-08"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/console.log" append="off"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:29:08 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:29:08 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:29:08 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:29:08 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.853 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Preparing to wait for external event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.854 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.854 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.854 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.856 182627 DEBUG nova.virt.libvirt.vif [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1953303378',display_name='tempest-ServerDiskConfigTestJSON-server-1953303378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1953303378',id=73,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-zx4ant69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:04Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=8c343772-6b41-4817-ab66-4bb05c591cc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.856 182627 DEBUG nova.network.os_vif_util [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.857 182627 DEBUG nova.network.os_vif_util [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.858 182627 DEBUG os_vif [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.859 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.859 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.860 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.863 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.864 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63663dd8-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.865 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63663dd8-08, col_values=(('external_ids', {'iface-id': '63663dd8-0844-4e65-9378-64416c0b1178', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:b1:02', 'vm-uuid': '8c343772-6b41-4817-ab66-4bb05c591cc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:08 np0005592767 NetworkManager[54973]: <info>  [1769120948.9140] manager: (tap63663dd8-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.913 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.917 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.920 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.921 182627 INFO os_vif [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08')#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.985 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.985 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.986 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:01:b1:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:29:08 np0005592767 nova_compute[182623]: 2026-01-22 22:29:08.986 182627 INFO nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Using config drive#033[00m
Jan 22 17:29:09 np0005592767 nova_compute[182623]: 2026-01-22 22:29:09.406 182627 INFO nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Creating config drive at /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config#033[00m
Jan 22 17:29:09 np0005592767 nova_compute[182623]: 2026-01-22 22:29:09.415 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpda_gq0sl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:09 np0005592767 nova_compute[182623]: 2026-01-22 22:29:09.555 182627 DEBUG oslo_concurrency.processutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpda_gq0sl" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:09 np0005592767 kernel: tap63663dd8-08: entered promiscuous mode
Jan 22 17:29:09 np0005592767 NetworkManager[54973]: <info>  [1769120949.6329] manager: (tap63663dd8-08): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 22 17:29:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:09Z|00255|binding|INFO|Claiming lport 63663dd8-0844-4e65-9378-64416c0b1178 for this chassis.
Jan 22 17:29:09 np0005592767 nova_compute[182623]: 2026-01-22 22:29:09.636 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:09Z|00256|binding|INFO|63663dd8-0844-4e65-9378-64416c0b1178: Claiming fa:16:3e:01:b1:02 10.100.0.8
Jan 22 17:29:09 np0005592767 nova_compute[182623]: 2026-01-22 22:29:09.647 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.655 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b1:02 10.100.0.8'], port_security=['fa:16:3e:01:b1:02 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c343772-6b41-4817-ab66-4bb05c591cc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=63663dd8-0844-4e65-9378-64416c0b1178) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.658 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 63663dd8-0844-4e65-9378-64416c0b1178 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.661 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3#033[00m
Jan 22 17:29:09 np0005592767 systemd-udevd[221225]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:09 np0005592767 NetworkManager[54973]: <info>  [1769120949.6736] device (tap63663dd8-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:29:09 np0005592767 NetworkManager[54973]: <info>  [1769120949.6744] device (tap63663dd8-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.679 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[17b3a821-2776-4f37-9af5-68d104f1dad6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.680 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.683 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.683 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2b49964e-73e8-44f1-a60c-5aceb5a5ab59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.684 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56019087-48b5-40bf-b207-b30265bdb447]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 systemd-machined[153912]: New machine qemu-35-instance-00000049.
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.703 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[02c50eef-226c-42c1-811d-d3e48ae07772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:09Z|00257|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 ovn-installed in OVS
Jan 22 17:29:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:09Z|00258|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 up in Southbound
Jan 22 17:29:09 np0005592767 nova_compute[182623]: 2026-01-22 22:29:09.726 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:09 np0005592767 systemd[1]: Started Virtual Machine qemu-35-instance-00000049.
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.734 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b4711544-1e1d-4d00-993b-aaad5416e3c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.787 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ef19c6-eac0-4e86-ae84-0897672502d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.795 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3e99f394-89bf-469c-a140-23746da69f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 systemd-udevd[221228]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:09 np0005592767 NetworkManager[54973]: <info>  [1769120949.7969] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/126)
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.841 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[210d07aa-cac5-4fd1-9515-03bfaa642a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.846 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b5910838-b2c9-41c1-8e72-9cb55feb715f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 NetworkManager[54973]: <info>  [1769120949.8697] device (tap354683a7-30): carrier: link connected
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.876 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e4ed5d-56cf-434f-a0dc-a4ae1a54eafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.892 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3d03b210-48e1-44ef-844e-0c7d6c5c573f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454644, 'reachable_time': 30085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221259, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.910 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9d08d2db-f9d5-41b0-aa51-2195e7e7c455]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454644, 'tstamp': 454644}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221260, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.929 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[54c90885-a5e7-4038-bba0-ace585ac4278]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454644, 'reachable_time': 30085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221261, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:09.963 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c3376f-a186-4a94-8279-6aa7972625ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.008 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eda0e668-b4cc-47c4-9a49-28dd1aa802a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.010 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.010 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.011 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:10 np0005592767 NetworkManager[54973]: <info>  [1769120950.0485] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Jan 22 17:29:10 np0005592767 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.051 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:10Z|00259|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.054 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.054 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.055 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[69334872-3e19-464c-b947-2caf146ec242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.056 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:29:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:10.056 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.065 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.100 182627 DEBUG nova.compute.manager [req-6d642574-51e3-4fab-83af-70cc525926f7 req-840804e8-e8ca-4421-9dee-a1ffa310a12a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.100 182627 DEBUG oslo_concurrency.lockutils [req-6d642574-51e3-4fab-83af-70cc525926f7 req-840804e8-e8ca-4421-9dee-a1ffa310a12a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.100 182627 DEBUG oslo_concurrency.lockutils [req-6d642574-51e3-4fab-83af-70cc525926f7 req-840804e8-e8ca-4421-9dee-a1ffa310a12a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.101 182627 DEBUG oslo_concurrency.lockutils [req-6d642574-51e3-4fab-83af-70cc525926f7 req-840804e8-e8ca-4421-9dee-a1ffa310a12a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.101 182627 DEBUG nova.compute.manager [req-6d642574-51e3-4fab-83af-70cc525926f7 req-840804e8-e8ca-4421-9dee-a1ffa310a12a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Processing event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.291 182627 DEBUG nova.network.neutron [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Updated VIF entry in instance network info cache for port 63663dd8-0844-4e65-9378-64416c0b1178. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.293 182627 DEBUG nova.network.neutron [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Updating instance_info_cache with network_info: [{"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.315 182627 DEBUG oslo_concurrency.lockutils [req-9adb21d4-226f-410a-bd9f-e9c54d48f817 req-8c19936d-7e60-45a9-b649-ecee01187158 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:29:10 np0005592767 podman[221293]: 2026-01-22 22:29:10.452809609 +0000 UTC m=+0.034818038 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:29:10 np0005592767 podman[221293]: 2026-01-22 22:29:10.699012685 +0000 UTC m=+0.281021134 container create db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:29:10 np0005592767 systemd[1]: Started libpod-conmon-db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867.scope.
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:10 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:29:10 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463b937eb8760eede7fcc27f4f60b4a9eb50f1656832001a4597daac6395cd34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.801 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:29:10 np0005592767 podman[221293]: 2026-01-22 22:29:10.802567789 +0000 UTC m=+0.384576198 container init db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.803 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120950.8012805, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.806 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Started (Lifecycle Event)#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.808 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.812 182627 INFO nova.virt.libvirt.driver [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance spawned successfully.#033[00m
Jan 22 17:29:10 np0005592767 podman[221293]: 2026-01-22 22:29:10.812454739 +0000 UTC m=+0.394463148 container start db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.812 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:29:10 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [NOTICE]   (221319) : New worker (221321) forked
Jan 22 17:29:10 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [NOTICE]   (221319) : Loading success.
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.840 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.845 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.855 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.856 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.856 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.856 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.857 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.857 182627 DEBUG nova.virt.libvirt.driver [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.877 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.877 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120950.8026009, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.877 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.941 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.944 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120950.8082433, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.944 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.953 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120935.952516, 583cf74b-0bbd-4315-984c-5810efd4dede => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.954 182627 INFO nova.compute.manager [-] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.959 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.962 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.995 182627 INFO nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Took 6.39 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:29:10 np0005592767 nova_compute[182623]: 2026-01-22 22:29:10.996 182627 DEBUG nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:11 np0005592767 nova_compute[182623]: 2026-01-22 22:29:11.002 182627 DEBUG nova.compute.manager [None req-ce799024-8242-4aa1-823d-4d0dd50bd3be - - - - - -] [instance: 583cf74b-0bbd-4315-984c-5810efd4dede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:11 np0005592767 nova_compute[182623]: 2026-01-22 22:29:11.003 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:29:11 np0005592767 nova_compute[182623]: 2026-01-22 22:29:11.076 182627 INFO nova.compute.manager [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Took 6.96 seconds to build instance.#033[00m
Jan 22 17:29:11 np0005592767 nova_compute[182623]: 2026-01-22 22:29:11.093 182627 DEBUG oslo_concurrency.lockutils [None req-7fb75c14-58a3-4152-9cbe-257fdd36405d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:12.100 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:12.102 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:12.104 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:12 np0005592767 nova_compute[182623]: 2026-01-22 22:29:12.419 182627 DEBUG nova.compute.manager [req-4d32b99a-bce5-47f7-a0b0-9a9c1f832285 req-e6baa93d-6468-4180-b304-049d7643c4b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:12 np0005592767 nova_compute[182623]: 2026-01-22 22:29:12.420 182627 DEBUG oslo_concurrency.lockutils [req-4d32b99a-bce5-47f7-a0b0-9a9c1f832285 req-e6baa93d-6468-4180-b304-049d7643c4b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:12 np0005592767 nova_compute[182623]: 2026-01-22 22:29:12.420 182627 DEBUG oslo_concurrency.lockutils [req-4d32b99a-bce5-47f7-a0b0-9a9c1f832285 req-e6baa93d-6468-4180-b304-049d7643c4b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:12 np0005592767 nova_compute[182623]: 2026-01-22 22:29:12.421 182627 DEBUG oslo_concurrency.lockutils [req-4d32b99a-bce5-47f7-a0b0-9a9c1f832285 req-e6baa93d-6468-4180-b304-049d7643c4b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:12 np0005592767 nova_compute[182623]: 2026-01-22 22:29:12.421 182627 DEBUG nova.compute.manager [req-4d32b99a-bce5-47f7-a0b0-9a9c1f832285 req-e6baa93d-6468-4180-b304-049d7643c4b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] No waiting events found dispatching network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:29:12 np0005592767 nova_compute[182623]: 2026-01-22 22:29:12.421 182627 WARNING nova.compute.manager [req-4d32b99a-bce5-47f7-a0b0-9a9c1f832285 req-e6baa93d-6468-4180-b304-049d7643c4b9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received unexpected event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:29:13 np0005592767 podman[221331]: 2026-01-22 22:29:13.143139506 +0000 UTC m=+0.060211167 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc.)
Jan 22 17:29:13 np0005592767 podman[221330]: 2026-01-22 22:29:13.161829306 +0000 UTC m=+0.084178176 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:29:13 np0005592767 nova_compute[182623]: 2026-01-22 22:29:13.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:13 np0005592767 nova_compute[182623]: 2026-01-22 22:29:13.914 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:14 np0005592767 nova_compute[182623]: 2026-01-22 22:29:14.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.409 182627 INFO nova.compute.manager [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Rebuilding instance#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.734 182627 DEBUG nova.compute.manager [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.772 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.831 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.854 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.892 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.904 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.913 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:29:15 np0005592767 nova_compute[182623]: 2026-01-22 22:29:15.916 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.901 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.937 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.937 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.937 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:29:16 np0005592767 nova_compute[182623]: 2026-01-22 22:29:16.938 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.934 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Updating instance_info_cache with network_info: [{"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.957 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-8c343772-6b41-4817-ab66-4bb05c591cc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.957 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.960 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.961 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:29:18 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.961 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:18.999 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.000 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.000 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.001 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.080 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.156 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.156 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:19 np0005592767 podman[221377]: 2026-01-22 22:29:19.159892907 +0000 UTC m=+0.091182765 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:29:19 np0005592767 podman[221378]: 2026-01-22 22:29:19.169690084 +0000 UTC m=+0.103686649 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.218 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.390 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.392 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5573MB free_disk=73.23522567749023GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.392 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.393 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.571 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 8c343772-6b41-4817-ab66-4bb05c591cc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.572 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.572 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.737 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.752 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.777 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:29:19 np0005592767 nova_compute[182623]: 2026-01-22 22:29:19.778 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:20 np0005592767 nova_compute[182623]: 2026-01-22 22:29:20.774 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:22 np0005592767 nova_compute[182623]: 2026-01-22 22:29:22.715 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:22 np0005592767 nova_compute[182623]: 2026-01-22 22:29:22.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:22 np0005592767 nova_compute[182623]: 2026-01-22 22:29:22.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:29:22 np0005592767 nova_compute[182623]: 2026-01-22 22:29:22.910 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:29:22 np0005592767 nova_compute[182623]: 2026-01-22 22:29:22.910 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:22 np0005592767 nova_compute[182623]: 2026-01-22 22:29:22.910 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:29:23 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:23Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:b1:02 10.100.0.8
Jan 22 17:29:23 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:23Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:b1:02 10.100.0.8
Jan 22 17:29:23 np0005592767 nova_compute[182623]: 2026-01-22 22:29:23.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:23 np0005592767 nova_compute[182623]: 2026-01-22 22:29:23.919 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:25 np0005592767 podman[221439]: 2026-01-22 22:29:25.151575066 +0000 UTC m=+0.063930502 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:29:25 np0005592767 nova_compute[182623]: 2026-01-22 22:29:25.778 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:25 np0005592767 nova_compute[182623]: 2026-01-22 22:29:25.956 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:29:26 np0005592767 nova_compute[182623]: 2026-01-22 22:29:26.902 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:29:28 np0005592767 kernel: tap63663dd8-08 (unregistering): left promiscuous mode
Jan 22 17:29:28 np0005592767 NetworkManager[54973]: <info>  [1769120968.1343] device (tap63663dd8-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:29:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:28Z|00260|binding|INFO|Releasing lport 63663dd8-0844-4e65-9378-64416c0b1178 from this chassis (sb_readonly=0)
Jan 22 17:29:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:28Z|00261|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 down in Southbound
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:28Z|00262|binding|INFO|Removing iface tap63663dd8-08 ovn-installed in OVS
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.187 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b1:02 10.100.0.8'], port_security=['fa:16:3e:01:b1:02 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c343772-6b41-4817-ab66-4bb05c591cc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=63663dd8-0844-4e65-9378-64416c0b1178) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.190 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 63663dd8-0844-4e65-9378-64416c0b1178 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.194 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.196 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[27e31433-0ccb-48f7-9451-c3dba46e973a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.197 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore#033[00m
Jan 22 17:29:28 np0005592767 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 22 17:29:28 np0005592767 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000049.scope: Consumed 13.193s CPU time.
Jan 22 17:29:28 np0005592767 systemd-machined[153912]: Machine qemu-35-instance-00000049 terminated.
Jan 22 17:29:28 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [NOTICE]   (221319) : haproxy version is 2.8.14-c23fe91
Jan 22 17:29:28 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [NOTICE]   (221319) : path to executable is /usr/sbin/haproxy
Jan 22 17:29:28 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [WARNING]  (221319) : Exiting Master process...
Jan 22 17:29:28 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [ALERT]    (221319) : Current worker (221321) exited with code 143 (Terminated)
Jan 22 17:29:28 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221315]: [WARNING]  (221319) : All workers exited. Exiting... (0)
Jan 22 17:29:28 np0005592767 systemd[1]: libpod-db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867.scope: Deactivated successfully.
Jan 22 17:29:28 np0005592767 podman[221487]: 2026-01-22 22:29:28.328593736 +0000 UTC m=+0.048190507 container died db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:29:28 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867-userdata-shm.mount: Deactivated successfully.
Jan 22 17:29:28 np0005592767 systemd[1]: var-lib-containers-storage-overlay-463b937eb8760eede7fcc27f4f60b4a9eb50f1656832001a4597daac6395cd34-merged.mount: Deactivated successfully.
Jan 22 17:29:28 np0005592767 podman[221487]: 2026-01-22 22:29:28.367360094 +0000 UTC m=+0.086956875 container cleanup db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:29:28 np0005592767 systemd[1]: libpod-conmon-db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867.scope: Deactivated successfully.
Jan 22 17:29:28 np0005592767 NetworkManager[54973]: <info>  [1769120968.4145] manager: (tap63663dd8-08): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.415 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.420 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 podman[221514]: 2026-01-22 22:29:28.433671573 +0000 UTC m=+0.043145323 container remove db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.439 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1a62006d-483d-47dd-9a6c-26c61b9f16ee]: (4, ('Thu Jan 22 10:29:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867)\ndb9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867\nThu Jan 22 10:29:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (db9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867)\ndb9e3f1070df62ed2c9379c6eff3662b3b480b972f5eb35f92b109bcf2e4b867\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.441 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[841b2dab-3923-4ec6-8290-6de9353ac386]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.442 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.444 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 kernel: tap354683a7-30: left promiscuous mode
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.461 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.465 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e90f40-e7eb-4ea7-bc4d-95a53cd73eb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.486 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8eea5da6-3206-4a4f-82df-429d178644fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.487 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e0946274-78e8-4b6e-89ca-5cd1c28bd981]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.503 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[136aa8ea-8f74-4252-9fde-9f4837c7004c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454635, 'reachable_time': 41542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221546, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.507 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:29:28 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:28.508 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[e1fc932c-d4ad-4cca-bfda-de6fa4dedb9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.563 182627 DEBUG nova.compute.manager [req-6361a776-bfba-4f14-a945-6b749591b3ae req-5e33acf4-c360-4381-b38e-4577c7669db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-unplugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.564 182627 DEBUG oslo_concurrency.lockutils [req-6361a776-bfba-4f14-a945-6b749591b3ae req-5e33acf4-c360-4381-b38e-4577c7669db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.564 182627 DEBUG oslo_concurrency.lockutils [req-6361a776-bfba-4f14-a945-6b749591b3ae req-5e33acf4-c360-4381-b38e-4577c7669db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.564 182627 DEBUG oslo_concurrency.lockutils [req-6361a776-bfba-4f14-a945-6b749591b3ae req-5e33acf4-c360-4381-b38e-4577c7669db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.564 182627 DEBUG nova.compute.manager [req-6361a776-bfba-4f14-a945-6b749591b3ae req-5e33acf4-c360-4381-b38e-4577c7669db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] No waiting events found dispatching network-vif-unplugged-63663dd8-0844-4e65-9378-64416c0b1178 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.564 182627 WARNING nova.compute.manager [req-6361a776-bfba-4f14-a945-6b749591b3ae req-5e33acf4-c360-4381-b38e-4577c7669db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received unexpected event network-vif-unplugged-63663dd8-0844-4e65-9378-64416c0b1178 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.921 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.970 182627 INFO nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.978 182627 INFO nova.virt.libvirt.driver [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance destroyed successfully.#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.986 182627 INFO nova.virt.libvirt.driver [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance destroyed successfully.#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.987 182627 DEBUG nova.virt.libvirt.vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1953303378',display_name='tempest-ServerDiskConfigTestJSON-server-1953303378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1953303378',id=73,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-zx4ant69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-m
ember'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:14Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=8c343772-6b41-4817-ab66-4bb05c591cc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.988 182627 DEBUG nova.network.os_vif_util [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.990 182627 DEBUG nova.network.os_vif_util [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.992 182627 DEBUG os_vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.996 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:28 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.996 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63663dd8-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:28.998 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.004 182627 INFO os_vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08')#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.005 182627 INFO nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Deleting instance files /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0_del#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.006 182627 INFO nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Deletion of /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0_del complete#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.267 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.268 182627 INFO nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Creating image(s)#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.269 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.269 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.270 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.289 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.344 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.345 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.346 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.362 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.414 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.416 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.451 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.453 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.453 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.506 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.508 182627 DEBUG nova.virt.disk.api [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.509 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.584 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.586 182627 DEBUG nova.virt.disk.api [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.587 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.588 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Ensure instance console log exists: /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.589 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.589 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.590 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.594 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Start _get_guest_xml network_info=[{"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.602 182627 WARNING nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.609 182627 DEBUG nova.virt.libvirt.host [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.610 182627 DEBUG nova.virt.libvirt.host [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.613 182627 DEBUG nova.virt.libvirt.host [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.614 182627 DEBUG nova.virt.libvirt.host [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.615 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.615 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.615 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.616 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.616 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.616 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.616 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.616 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.617 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.617 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.617 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.617 182627 DEBUG nova.virt.hardware [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.618 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.633 182627 DEBUG nova.virt.libvirt.vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1953303378',display_name='tempest-ServerDiskConfigTestJSON-server-1953303378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1953303378',id=73,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-zx4ant69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempes
t-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:29Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=8c343772-6b41-4817-ab66-4bb05c591cc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.634 182627 DEBUG nova.network.os_vif_util [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.635 182627 DEBUG nova.network.os_vif_util [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.636 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <uuid>8c343772-6b41-4817-ab66-4bb05c591cc0</uuid>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <name>instance-00000049</name>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1953303378</nova:name>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:29:29</nova:creationTime>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        <nova:port uuid="63663dd8-0844-4e65-9378-64416c0b1178">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <entry name="serial">8c343772-6b41-4817-ab66-4bb05c591cc0</entry>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <entry name="uuid">8c343772-6b41-4817-ab66-4bb05c591cc0</entry>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:01:b1:02"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <target dev="tap63663dd8-08"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/console.log" append="off"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:29:29 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:29:29 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:29:29 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:29:29 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.637 182627 DEBUG nova.compute.manager [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Preparing to wait for external event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.638 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.638 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.638 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.639 182627 DEBUG nova.virt.libvirt.vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1953303378',display_name='tempest-ServerDiskConfigTestJSON-server-1953303378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1953303378',id=73,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-zx4ant69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempes
t-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:29Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=8c343772-6b41-4817-ab66-4bb05c591cc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.639 182627 DEBUG nova.network.os_vif_util [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.640 182627 DEBUG nova.network.os_vif_util [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.640 182627 DEBUG os_vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.640 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.641 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.641 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.643 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63663dd8-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.644 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63663dd8-08, col_values=(('external_ids', {'iface-id': '63663dd8-0844-4e65-9378-64416c0b1178', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:b1:02', 'vm-uuid': '8c343772-6b41-4817-ab66-4bb05c591cc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.645 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:29 np0005592767 NetworkManager[54973]: <info>  [1769120969.6462] manager: (tap63663dd8-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.647 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.649 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.650 182627 INFO os_vif [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08')#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.692 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.692 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.692 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:01:b1:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.693 182627 INFO nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Using config drive#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.709 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:29 np0005592767 nova_compute[182623]: 2026-01-22 22:29:29.738 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'keypairs' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.382 182627 INFO nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Creating config drive at /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.387 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_6eyi36 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.523 182627 DEBUG oslo_concurrency.processutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_6eyi36" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:30 np0005592767 kernel: tap63663dd8-08: entered promiscuous mode
Jan 22 17:29:30 np0005592767 NetworkManager[54973]: <info>  [1769120970.5807] manager: (tap63663dd8-08): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Jan 22 17:29:30 np0005592767 systemd-udevd[221466]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.581 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:30Z|00263|binding|INFO|Claiming lport 63663dd8-0844-4e65-9378-64416c0b1178 for this chassis.
Jan 22 17:29:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:30Z|00264|binding|INFO|63663dd8-0844-4e65-9378-64416c0b1178: Claiming fa:16:3e:01:b1:02 10.100.0.8
Jan 22 17:29:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:30Z|00265|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 ovn-installed in OVS
Jan 22 17:29:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:30Z|00266|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 up in Southbound
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.594 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b1:02 10.100.0.8'], port_security=['fa:16:3e:01:b1:02 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c343772-6b41-4817-ab66-4bb05c591cc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=63663dd8-0844-4e65-9378-64416c0b1178) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.595 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 NetworkManager[54973]: <info>  [1769120970.5973] device (tap63663dd8-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:29:30 np0005592767 NetworkManager[54973]: <info>  [1769120970.5979] device (tap63663dd8-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.598 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.597 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 63663dd8-0844-4e65-9378-64416c0b1178 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.600 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.606 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.613 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9b774a62-fba1-48c5-b046-09be45ee89cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.614 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.617 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.617 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[58e96db3-26b7-4383-944b-411ae3f0ec54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.618 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1972fcf8-d31a-4cea-8f33-65302dd13d89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 systemd-machined[153912]: New machine qemu-36-instance-00000049.
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.630 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa71ea8-b34f-426d-a757-e79f4dbd1b97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 systemd[1]: Started Virtual Machine qemu-36-instance-00000049.
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.652 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8cb84f-8ec0-491d-9b85-5a75b912e4fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.661 182627 DEBUG nova.compute.manager [req-6177def4-d548-4128-9313-8c1dafa19e27 req-ecf24f27-e6db-4ec5-acb4-101b2019b0f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.662 182627 DEBUG oslo_concurrency.lockutils [req-6177def4-d548-4128-9313-8c1dafa19e27 req-ecf24f27-e6db-4ec5-acb4-101b2019b0f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.662 182627 DEBUG oslo_concurrency.lockutils [req-6177def4-d548-4128-9313-8c1dafa19e27 req-ecf24f27-e6db-4ec5-acb4-101b2019b0f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.662 182627 DEBUG oslo_concurrency.lockutils [req-6177def4-d548-4128-9313-8c1dafa19e27 req-ecf24f27-e6db-4ec5-acb4-101b2019b0f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.663 182627 DEBUG nova.compute.manager [req-6177def4-d548-4128-9313-8c1dafa19e27 req-ecf24f27-e6db-4ec5-acb4-101b2019b0f4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Processing event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.684 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4dd26f-5c3b-4d1a-81a4-259c35725aea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.692 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c5cadbf5-4bca-4aef-9c77-3f2eff487f94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 NetworkManager[54973]: <info>  [1769120970.6934] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.723 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5c9a01-33dd-4f5d-b536-1fa9aa0cc654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.725 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[287d57bb-a63c-4901-b4e7-da99541e3e90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 NetworkManager[54973]: <info>  [1769120970.7476] device (tap354683a7-30): carrier: link connected
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.752 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[523bae46-e827-4faa-a3ea-b152805e2e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.770 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[01e3d06a-3e67-4872-9aeb-1586cefe4004]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456732, 'reachable_time': 30121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221613, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.787 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[05c6f453-a498-42ee-8a7a-9c68e7aa3b7d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456732, 'tstamp': 456732}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221614, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.804 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[da2f3999-5937-4b76-8bf3-ef7258802c78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456732, 'reachable_time': 30121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221615, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.835 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[998a1314-147c-4782-9792-28f65a97eb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.890 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cd011489-ff19-40c8-9936-49d908a70b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.891 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.892 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.892 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:30 np0005592767 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 17:29:30 np0005592767 NetworkManager[54973]: <info>  [1769120970.8945] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.895 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.900 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.900 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:30Z|00267|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.924 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.927 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.928 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fac61fbc-2d07-4ea9-a888-fc424ed175f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.928 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:29:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:30.929 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.964 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 8c343772-6b41-4817-ab66-4bb05c591cc0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.965 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120970.9643123, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.965 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Started (Lifecycle Event)
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.967 182627 DEBUG nova.compute.manager [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.981 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.984 182627 INFO nova.virt.libvirt.driver [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance spawned successfully.
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.984 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.989 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:30 np0005592767 nova_compute[182623]: 2026-01-22 22:29:30.991 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.012 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.012 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.013 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.013 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.014 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.014 182627 DEBUG nova.virt.libvirt.driver [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.017 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.018 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120970.9651642, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.018 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Paused (Lifecycle Event)
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.047 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.052 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120970.9702682, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.052 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Resumed (Lifecycle Event)
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.081 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.086 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.113 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.123 182627 DEBUG nova.compute.manager [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.217 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.219 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.219 182627 DEBUG nova.objects.instance [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 17:29:31 np0005592767 nova_compute[182623]: 2026-01-22 22:29:31.298 182627 DEBUG oslo_concurrency.lockutils [None req-26262a7e-32b0-441c-b6cf-6228ea213510 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:31 np0005592767 podman[221653]: 2026-01-22 22:29:31.304398573 +0000 UTC m=+0.064149929 container create 06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:29:31 np0005592767 systemd[1]: Started libpod-conmon-06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d.scope.
Jan 22 17:29:31 np0005592767 podman[221653]: 2026-01-22 22:29:31.266635813 +0000 UTC m=+0.026387199 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:29:31 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:29:31 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5696c857799fc76577715c22d2ffbb075004b0dc9f898f77c353155aac7f7ba9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:29:31 np0005592767 podman[221653]: 2026-01-22 22:29:31.380021926 +0000 UTC m=+0.139773332 container init 06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:29:31 np0005592767 podman[221653]: 2026-01-22 22:29:31.386760877 +0000 UTC m=+0.146512243 container start 06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:29:31 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [NOTICE]   (221672) : New worker (221674) forked
Jan 22 17:29:31 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [NOTICE]   (221672) : Loading success.
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.863 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.864 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.864 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.865 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.865 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.878 182627 INFO nova.compute.manager [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Terminating instance
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.892 182627 DEBUG nova.compute.manager [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:29:32 np0005592767 kernel: tap63663dd8-08 (unregistering): left promiscuous mode
Jan 22 17:29:32 np0005592767 NetworkManager[54973]: <info>  [1769120972.9178] device (tap63663dd8-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.930 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:32Z|00268|binding|INFO|Releasing lport 63663dd8-0844-4e65-9378-64416c0b1178 from this chassis (sb_readonly=0)
Jan 22 17:29:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:32Z|00269|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 down in Southbound
Jan 22 17:29:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:32Z|00270|binding|INFO|Removing iface tap63663dd8-08 ovn-installed in OVS
Jan 22 17:29:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:32.938 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b1:02 10.100.0.8'], port_security=['fa:16:3e:01:b1:02 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c343772-6b41-4817-ab66-4bb05c591cc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=63663dd8-0844-4e65-9378-64416c0b1178) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:29:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:32.941 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 63663dd8-0844-4e65-9378-64416c0b1178 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis
Jan 22 17:29:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:32.944 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:29:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:32.945 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cc372c86-f990-445b-9f39-355db1cea791]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:29:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:32.945 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:32 np0005592767 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 22 17:29:32 np0005592767 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000049.scope: Consumed 2.250s CPU time.
Jan 22 17:29:32 np0005592767 systemd-machined[153912]: Machine qemu-36-instance-00000049 terminated.
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.992 182627 DEBUG nova.compute.manager [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.993 182627 DEBUG oslo_concurrency.lockutils [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.993 182627 DEBUG oslo_concurrency.lockutils [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.993 182627 DEBUG oslo_concurrency.lockutils [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.994 182627 DEBUG nova.compute.manager [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] No waiting events found dispatching network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.994 182627 WARNING nova.compute.manager [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received unexpected event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 for instance with vm_state active and task_state deleting.
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.994 182627 DEBUG nova.compute.manager [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.994 182627 DEBUG oslo_concurrency.lockutils [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.994 182627 DEBUG oslo_concurrency.lockutils [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.995 182627 DEBUG oslo_concurrency.lockutils [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.995 182627 DEBUG nova.compute.manager [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] No waiting events found dispatching network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:29:32 np0005592767 nova_compute[182623]: 2026-01-22 22:29:32.995 182627 WARNING nova.compute.manager [req-02f6af92-bbef-411e-8a30-78b5a4978dfa req-5910e7eb-4228-4de4-bab5-17ae6a2f1948 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received unexpected event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 for instance with vm_state active and task_state deleting.
Jan 22 17:29:33 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [NOTICE]   (221672) : haproxy version is 2.8.14-c23fe91
Jan 22 17:29:33 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [NOTICE]   (221672) : path to executable is /usr/sbin/haproxy
Jan 22 17:29:33 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [WARNING]  (221672) : Exiting Master process...
Jan 22 17:29:33 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [ALERT]    (221672) : Current worker (221674) exited with code 143 (Terminated)
Jan 22 17:29:33 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221668]: [WARNING]  (221672) : All workers exited. Exiting... (0)
Jan 22 17:29:33 np0005592767 systemd[1]: libpod-06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d.scope: Deactivated successfully.
Jan 22 17:29:33 np0005592767 podman[221708]: 2026-01-22 22:29:33.082860465 +0000 UTC m=+0.046706715 container died 06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:29:33 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:29:33 np0005592767 kernel: tap63663dd8-08: entered promiscuous mode
Jan 22 17:29:33 np0005592767 NetworkManager[54973]: <info>  [1769120973.1157] manager: (tap63663dd8-08): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Jan 22 17:29:33 np0005592767 systemd-udevd[221688]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00271|binding|INFO|Claiming lport 63663dd8-0844-4e65-9378-64416c0b1178 for this chassis.
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00272|binding|INFO|63663dd8-0844-4e65-9378-64416c0b1178: Claiming fa:16:3e:01:b1:02 10.100.0.8
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.117 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.126 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b1:02 10.100.0.8'], port_security=['fa:16:3e:01:b1:02 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c343772-6b41-4817-ab66-4bb05c591cc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=63663dd8-0844-4e65-9378-64416c0b1178) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:29:33 np0005592767 systemd[1]: var-lib-containers-storage-overlay-5696c857799fc76577715c22d2ffbb075004b0dc9f898f77c353155aac7f7ba9-merged.mount: Deactivated successfully.
Jan 22 17:29:33 np0005592767 kernel: tap63663dd8-08 (unregistering): left promiscuous mode
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: hostname: compute-2
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00273|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 ovn-installed in OVS
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00274|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 up in Southbound
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.136 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00275|binding|INFO|Releasing lport 63663dd8-0844-4e65-9378-64416c0b1178 from this chassis (sb_readonly=1)
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.138 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00276|binding|INFO|Removing iface tap63663dd8-08 ovn-installed in OVS
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00277|if_status|INFO|Not setting lport 63663dd8-0844-4e65-9378-64416c0b1178 down as sb is readonly
Jan 22 17:29:33 np0005592767 podman[221708]: 2026-01-22 22:29:33.140272092 +0000 UTC m=+0.104118342 container cleanup 06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00278|binding|INFO|Releasing lport 63663dd8-0844-4e65-9378-64416c0b1178 from this chassis (sb_readonly=0)
Jan 22 17:29:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:33Z|00279|binding|INFO|Setting lport 63663dd8-0844-4e65-9378-64416c0b1178 down in Southbound
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.142 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 systemd[1]: libpod-conmon-06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d.scope: Deactivated successfully.
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.155 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b1:02 10.100.0.8'], port_security=['fa:16:3e:01:b1:02 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8c343772-6b41-4817-ab66-4bb05c591cc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=63663dd8-0844-4e65-9378-64416c0b1178) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 virtnodedevd[182364]: ethtool ioctl error on tap63663dd8-08: No such device
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.175 182627 INFO nova.virt.libvirt.driver [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Instance destroyed successfully.#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.176 182627 DEBUG nova.objects.instance [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 8c343772-6b41-4817-ab66-4bb05c591cc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.192 182627 DEBUG nova.virt.libvirt.vif [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1953303378',display_name='tempest-ServerDiskConfigTestJSON-server-1953303378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1953303378',id=73,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-zx4ant69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:29:31Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=8c343772-6b41-4817-ab66-4bb05c591cc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.193 182627 DEBUG nova.network.os_vif_util [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "63663dd8-0844-4e65-9378-64416c0b1178", "address": "fa:16:3e:01:b1:02", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63663dd8-08", "ovs_interfaceid": "63663dd8-0844-4e65-9378-64416c0b1178", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.194 182627 DEBUG nova.network.os_vif_util [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.194 182627 DEBUG os_vif [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.196 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.196 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63663dd8-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.197 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.198 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.200 182627 INFO os_vif [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:b1:02,bridge_name='br-int',has_traffic_filtering=True,id=63663dd8-0844-4e65-9378-64416c0b1178,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63663dd8-08')#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.201 182627 INFO nova.virt.libvirt.driver [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Deleting instance files /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0_del#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.201 182627 INFO nova.virt.libvirt.driver [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Deletion of /var/lib/nova/instances/8c343772-6b41-4817-ab66-4bb05c591cc0_del complete#033[00m
Jan 22 17:29:33 np0005592767 podman[221754]: 2026-01-22 22:29:33.21184966 +0000 UTC m=+0.047253890 container remove 06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.216 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb30106-3106-40a7-8bf7-87f0ceedf7e5]: (4, ('Thu Jan 22 10:29:33 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d)\n06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d\nThu Jan 22 10:29:33 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d)\n06236a93038ba32b8ccc2dcf23307d751a5d0af16878a00362ecefbe5fb0743d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.218 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75e8f89b-7015-4a3d-ba9f-71194443bd66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.219 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.220 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 kernel: tap354683a7-30: left promiscuous mode
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.225 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0db93bb0-b36a-4d80-bfc4-f6cc5bbffdc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.232 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.247 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[53e96eee-b072-486a-9a53-226373c3edfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.249 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e52997-e6ab-415b-b571-c8376f9da214]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.266 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[317796c8-af7c-495b-85a3-8d3ae9e33500]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456725, 'reachable_time': 19497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221781, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.268 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.268 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[1467049a-b818-42cf-bbc3-12b2f6360068]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.269 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 63663dd8-0844-4e65-9378-64416c0b1178 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:29:33 np0005592767 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.270 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.271 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75f6d0fa-6967-448a-8504-9d26a88d95f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.271 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 63663dd8-0844-4e65-9378-64416c0b1178 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.273 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:29:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:33.273 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[08e7ccfc-1e3b-405f-a452-84e9677adc3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.275 182627 INFO nova.compute.manager [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.275 182627 DEBUG oslo.service.loopingcall [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.275 182627 DEBUG nova.compute.manager [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:29:33 np0005592767 nova_compute[182623]: 2026-01-22 22:29:33.276 182627 DEBUG nova.network.neutron [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.078 182627 DEBUG nova.network.neutron [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.095 182627 INFO nova.compute.manager [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Took 0.82 seconds to deallocate network for instance.#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.185 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.185 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.222 182627 DEBUG nova.compute.manager [req-fb27ee17-c478-40bf-a725-49ea55daad85 req-48b23e90-5888-44db-80be-2dfedf2fe58f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-deleted-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.246 182627 DEBUG nova.compute.provider_tree [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.264 182627 DEBUG nova.scheduler.client.report [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.296 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.320 182627 INFO nova.scheduler.client.report [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Deleted allocations for instance 8c343772-6b41-4817-ab66-4bb05c591cc0#033[00m
Jan 22 17:29:34 np0005592767 nova_compute[182623]: 2026-01-22 22:29:34.395 182627 DEBUG oslo_concurrency.lockutils [None req-032d2b95-8e81-4041-b8f4-7434db819405 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.072 182627 DEBUG nova.compute.manager [req-098dc895-c11d-43c8-b319-f5625c26d5b0 req-ea77c6f4-b12c-4b23-b864-5390a571dd54 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.072 182627 DEBUG oslo_concurrency.lockutils [req-098dc895-c11d-43c8-b319-f5625c26d5b0 req-ea77c6f4-b12c-4b23-b864-5390a571dd54 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.073 182627 DEBUG oslo_concurrency.lockutils [req-098dc895-c11d-43c8-b319-f5625c26d5b0 req-ea77c6f4-b12c-4b23-b864-5390a571dd54 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.073 182627 DEBUG oslo_concurrency.lockutils [req-098dc895-c11d-43c8-b319-f5625c26d5b0 req-ea77c6f4-b12c-4b23-b864-5390a571dd54 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8c343772-6b41-4817-ab66-4bb05c591cc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.073 182627 DEBUG nova.compute.manager [req-098dc895-c11d-43c8-b319-f5625c26d5b0 req-ea77c6f4-b12c-4b23-b864-5390a571dd54 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] No waiting events found dispatching network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.073 182627 WARNING nova.compute.manager [req-098dc895-c11d-43c8-b319-f5625c26d5b0 req-ea77c6f4-b12c-4b23-b864-5390a571dd54 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Received unexpected event network-vif-plugged-63663dd8-0844-4e65-9378-64416c0b1178 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.645 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.646 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.661 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.754 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.754 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.761 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.762 182627 INFO nova.compute.claims [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.781 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.889 182627 DEBUG nova.compute.provider_tree [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.903 182627 DEBUG nova.scheduler.client.report [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.928 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.929 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.988 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:29:35 np0005592767 nova_compute[182623]: 2026-01-22 22:29:35.988 182627 DEBUG nova.network.neutron [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.016 182627 INFO nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.038 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.164 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.166 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.167 182627 INFO nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Creating image(s)#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.169 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.169 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.170 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.200 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.270 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.272 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.274 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.303 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.340 182627 DEBUG nova.policy [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.395 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.396 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.438 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.441 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.442 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.512 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.515 182627 DEBUG nova.virt.disk.api [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.516 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.580 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.582 182627 DEBUG nova.virt.disk.api [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.583 182627 DEBUG nova.objects.instance [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.601 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.602 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Ensure instance console log exists: /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.603 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.603 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:36 np0005592767 nova_compute[182623]: 2026-01-22 22:29:36.604 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:37 np0005592767 nova_compute[182623]: 2026-01-22 22:29:37.047 182627 DEBUG nova.network.neutron [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Successfully created port: 60464675-d651-488b-a1aa-832103327e7f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.199 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.203 182627 DEBUG nova.network.neutron [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Successfully updated port: 60464675-d651-488b-a1aa-832103327e7f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.231 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.231 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.231 182627 DEBUG nova.network.neutron [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.296 182627 DEBUG nova.compute.manager [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-changed-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.297 182627 DEBUG nova.compute.manager [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Refreshing instance network info cache due to event network-changed-60464675-d651-488b-a1aa-832103327e7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:29:38 np0005592767 nova_compute[182623]: 2026-01-22 22:29:38.297 182627 DEBUG oslo_concurrency.lockutils [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:29:39 np0005592767 podman[221797]: 2026-01-22 22:29:39.176825862 +0000 UTC m=+0.098481382 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:29:39 np0005592767 nova_compute[182623]: 2026-01-22 22:29:39.281 182627 DEBUG nova.network.neutron [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.644 182627 DEBUG nova.network.neutron [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Updating instance_info_cache with network_info: [{"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.668 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.669 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance network_info: |[{"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.669 182627 DEBUG oslo_concurrency.lockutils [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.669 182627 DEBUG nova.network.neutron [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Refreshing network info cache for port 60464675-d651-488b-a1aa-832103327e7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.673 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Start _get_guest_xml network_info=[{"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.678 182627 WARNING nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.698 182627 DEBUG nova.virt.libvirt.host [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.701 182627 DEBUG nova.virt.libvirt.host [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.706 182627 DEBUG nova.virt.libvirt.host [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.707 182627 DEBUG nova.virt.libvirt.host [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.709 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.709 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.710 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.710 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.710 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.711 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.711 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.711 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.711 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.711 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.712 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.712 182627 DEBUG nova.virt.hardware [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.717 182627 DEBUG nova.virt.libvirt.vif [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1748424325',display_name='tempest-ServerDiskConfigTestJSON-server-1748424325',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1748424325',id=77,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-us2yub10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDi
skConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:36Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=c98a3885-eda6-4fd8-a2c3-73b2a825cbd3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.718 182627 DEBUG nova.network.os_vif_util [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.718 182627 DEBUG nova.network.os_vif_util [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.720 182627 DEBUG nova.objects.instance [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.735 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <uuid>c98a3885-eda6-4fd8-a2c3-73b2a825cbd3</uuid>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <name>instance-0000004d</name>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1748424325</nova:name>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:29:40</nova:creationTime>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        <nova:port uuid="60464675-d651-488b-a1aa-832103327e7f">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <entry name="serial">c98a3885-eda6-4fd8-a2c3-73b2a825cbd3</entry>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <entry name="uuid">c98a3885-eda6-4fd8-a2c3-73b2a825cbd3</entry>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:db:51:33"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <target dev="tap60464675-d6"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/console.log" append="off"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:29:40 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:29:40 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:29:40 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:29:40 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.736 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Preparing to wait for external event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.737 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.737 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.738 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.739 182627 DEBUG nova.virt.libvirt.vif [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:29:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1748424325',display_name='tempest-ServerDiskConfigTestJSON-server-1748424325',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1748424325',id=77,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-us2yub10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempes
t-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:36Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=c98a3885-eda6-4fd8-a2c3-73b2a825cbd3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.739 182627 DEBUG nova.network.os_vif_util [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.740 182627 DEBUG nova.network.os_vif_util [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.740 182627 DEBUG os_vif [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.741 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.741 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.742 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.746 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.746 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60464675-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.747 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60464675-d6, col_values=(('external_ids', {'iface-id': '60464675-d651-488b-a1aa-832103327e7f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:51:33', 'vm-uuid': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.749 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:40 np0005592767 NetworkManager[54973]: <info>  [1769120980.7507] manager: (tap60464675-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.751 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.754 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.755 182627 INFO os_vif [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6')#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.807 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.807 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.808 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:db:51:33, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:29:40 np0005592767 nova_compute[182623]: 2026-01-22 22:29:40.808 182627 INFO nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Using config drive#033[00m
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.266 182627 INFO nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Creating config drive at /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config#033[00m
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.277 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaxdk6wu7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.404 182627 DEBUG oslo_concurrency.processutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaxdk6wu7" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:29:41 np0005592767 kernel: tap60464675-d6: entered promiscuous mode
Jan 22 17:29:41 np0005592767 NetworkManager[54973]: <info>  [1769120981.4942] manager: (tap60464675-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.497 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:41Z|00280|binding|INFO|Claiming lport 60464675-d651-488b-a1aa-832103327e7f for this chassis.
Jan 22 17:29:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:41Z|00281|binding|INFO|60464675-d651-488b-a1aa-832103327e7f: Claiming fa:16:3e:db:51:33 10.100.0.11
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.518 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:41Z|00282|binding|INFO|Setting lport 60464675-d651-488b-a1aa-832103327e7f ovn-installed in OVS
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.519 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.523 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:41Z|00283|binding|INFO|Setting lport 60464675-d651-488b-a1aa-832103327e7f up in Southbound
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.527 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:33 10.100.0.11'], port_security=['fa:16:3e:db:51:33 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=60464675-d651-488b-a1aa-832103327e7f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.528 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 60464675-d651-488b-a1aa-832103327e7f in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.530 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3#033[00m
Jan 22 17:29:41 np0005592767 systemd-udevd[221836]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.544 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4902c0-257e-4cdc-9d6e-0ad20fbfb75e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.545 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:29:41 np0005592767 systemd-machined[153912]: New machine qemu-37-instance-0000004d.
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.547 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.547 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f43e9a78-739a-44d8-92d4-14d7ecb467df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.548 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb835b3-3ade-40f3-958c-24d60a4f13c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 NetworkManager[54973]: <info>  [1769120981.5563] device (tap60464675-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:29:41 np0005592767 NetworkManager[54973]: <info>  [1769120981.5570] device (tap60464675-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.560 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f23f7df9-ae40-4151-b2af-0f88b635de4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 systemd[1]: Started Virtual Machine qemu-37-instance-0000004d.
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.585 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e188aaff-e16f-42f0-abc7-ac9d7dcd4bc7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.621 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[17c578e4-a94a-44c7-b46a-7f513fd94ca2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.626 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[171c4e9d-a691-40ab-9bdf-ef47e4d188c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 NetworkManager[54973]: <info>  [1769120981.6283] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.657 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc929c9-9ef8-4eac-89d5-c9ee61fa4ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.661 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8784ecc5-dc9c-4d14-a56e-c83f656a9703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 NetworkManager[54973]: <info>  [1769120981.6903] device (tap354683a7-30): carrier: link connected
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.696 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[41d77e69-b0cf-4e3c-9089-3b54f8287f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.712 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd43bf2-70d3-4cfc-857a-604a89efbbe4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457826, 'reachable_time': 28118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221870, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.724 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[04974c14-5538-44c9-b970-440f525ccb88]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457826, 'tstamp': 457826}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221871, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.742 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e40b95ef-f277-4f8f-9acb-54889c352fb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457826, 'reachable_time': 28118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221872, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.777 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ed71aa-98b7-4805-aff3-392cea174f41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.854 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf3a97c-8f59-49cb-a8ba-920259999baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.856 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.857 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.858 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:41 np0005592767 NetworkManager[54973]: <info>  [1769120981.8610] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 22 17:29:41 np0005592767 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.860 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.866 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.868 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:41Z|00284|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.870 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.877 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9f82bb-a76c-4bde-9eca-5219ede5573f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:29:41 np0005592767 nova_compute[182623]: 2026-01-22 22:29:41.879 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.879 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:29:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:41.880 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:29:42 np0005592767 podman[221904]: 2026-01-22 22:29:42.23247074 +0000 UTC m=+0.044554763 container create f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:29:42 np0005592767 systemd[1]: Started libpod-conmon-f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96.scope.
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.305 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120982.3035054, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:42 np0005592767 podman[221904]: 2026-01-22 22:29:42.208281035 +0000 UTC m=+0.020365088 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.305 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Started (Lifecycle Event)#033[00m
Jan 22 17:29:42 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:29:42 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed1b5148ec203eab79441d3aaaf95e6a4aa56643d1ec4638a19b5f9575a46fae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:29:42 np0005592767 podman[221904]: 2026-01-22 22:29:42.342644772 +0000 UTC m=+0.154728815 container init f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:29:42 np0005592767 podman[221904]: 2026-01-22 22:29:42.347639023 +0000 UTC m=+0.159723056 container start f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.355 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.361 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120982.3053257, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.361 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:29:42 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [NOTICE]   (221930) : New worker (221932) forked
Jan 22 17:29:42 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [NOTICE]   (221930) : Loading success.
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.408 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.414 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:29:42 np0005592767 nova_compute[182623]: 2026-01-22 22:29:42.449 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.978 182627 DEBUG nova.compute.manager [req-061eb17f-5050-4eca-b934-2628610a4f17 req-39da3563-148a-41db-a574-c02f80055153 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.979 182627 DEBUG oslo_concurrency.lockutils [req-061eb17f-5050-4eca-b934-2628610a4f17 req-39da3563-148a-41db-a574-c02f80055153 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.979 182627 DEBUG oslo_concurrency.lockutils [req-061eb17f-5050-4eca-b934-2628610a4f17 req-39da3563-148a-41db-a574-c02f80055153 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.979 182627 DEBUG oslo_concurrency.lockutils [req-061eb17f-5050-4eca-b934-2628610a4f17 req-39da3563-148a-41db-a574-c02f80055153 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.979 182627 DEBUG nova.compute.manager [req-061eb17f-5050-4eca-b934-2628610a4f17 req-39da3563-148a-41db-a574-c02f80055153 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Processing event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.980 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.984 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769120983.9840176, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.984 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.985 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.990 182627 INFO nova.virt.libvirt.driver [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance spawned successfully.#033[00m
Jan 22 17:29:43 np0005592767 nova_compute[182623]: 2026-01-22 22:29:43.990 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.012 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.014 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.014 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.014 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.015 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.015 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.016 182627 DEBUG nova.virt.libvirt.driver [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.020 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.066 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.112 182627 INFO nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Took 7.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.113 182627 DEBUG nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:44 np0005592767 podman[221942]: 2026-01-22 22:29:44.177407018 +0000 UTC m=+0.074120471 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, architecture=x86_64, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:29:44 np0005592767 podman[221941]: 2026-01-22 22:29:44.199833704 +0000 UTC m=+0.103462643 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.214 182627 DEBUG nova.network.neutron [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Updated VIF entry in instance network info cache for port 60464675-d651-488b-a1aa-832103327e7f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.214 182627 DEBUG nova.network.neutron [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Updating instance_info_cache with network_info: [{"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.215 182627 INFO nova.compute.manager [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Took 8.49 seconds to build instance.#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.242 182627 DEBUG oslo_concurrency.lockutils [req-f2179023-4f6c-4949-a284-ca7f6540cfc8 req-98c4ebc9-8d54-4a27-a2e5-fa569b5400cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:29:44 np0005592767 nova_compute[182623]: 2026-01-22 22:29:44.256 182627 DEBUG oslo_concurrency.lockutils [None req-324e160d-f333-48ec-b4cb-29c06e7e22d7 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:45 np0005592767 nova_compute[182623]: 2026-01-22 22:29:45.750 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:45 np0005592767 nova_compute[182623]: 2026-01-22 22:29:45.785 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:46 np0005592767 nova_compute[182623]: 2026-01-22 22:29:46.164 182627 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:29:46 np0005592767 nova_compute[182623]: 2026-01-22 22:29:46.165 182627 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:29:46 np0005592767 nova_compute[182623]: 2026-01-22 22:29:46.165 182627 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:29:46 np0005592767 nova_compute[182623]: 2026-01-22 22:29:46.165 182627 DEBUG oslo_concurrency.lockutils [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:29:46 np0005592767 nova_compute[182623]: 2026-01-22 22:29:46.166 182627 DEBUG nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] No waiting events found dispatching network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:29:46 np0005592767 nova_compute[182623]: 2026-01-22 22:29:46.166 182627 WARNING nova.compute.manager [req-19251796-3551-498d-81e8-19184245e3f3 req-681337d6-85af-46d7-ae7a-3da187bb8708 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received unexpected event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f for instance with vm_state active and task_state None.#033[00m
Jan 22 17:29:48 np0005592767 nova_compute[182623]: 2026-01-22 22:29:48.174 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769120973.1713495, 8c343772-6b41-4817-ab66-4bb05c591cc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:29:48 np0005592767 nova_compute[182623]: 2026-01-22 22:29:48.174 182627 INFO nova.compute.manager [-] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:29:48 np0005592767 nova_compute[182623]: 2026-01-22 22:29:48.201 182627 DEBUG nova.compute.manager [None req-368d72ba-f25d-4245-b439-9a103d2c6d40 - - - - - -] [instance: 8c343772-6b41-4817-ab66-4bb05c591cc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.036 182627 INFO nova.compute.manager [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Rebuilding instance#033[00m
Jan 22 17:29:50 np0005592767 podman[221989]: 2026-01-22 22:29:50.131013298 +0000 UTC m=+0.046133268 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:29:50 np0005592767 podman[221988]: 2026-01-22 22:29:50.15295881 +0000 UTC m=+0.073329439 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.325 182627 DEBUG nova.compute.manager [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.404 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_requests' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.424 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.452 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.462 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:50.461 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:29:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:50.463 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.470 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.488 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.491 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.751 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:50 np0005592767 nova_compute[182623]: 2026-01-22 22:29:50.788 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:55 np0005592767 nova_compute[182623]: 2026-01-22 22:29:55.753 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:55 np0005592767 nova_compute[182623]: 2026-01-22 22:29:55.790 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:29:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:55Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:51:33 10.100.0.11
Jan 22 17:29:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:29:55Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:51:33 10.100.0.11
Jan 22 17:29:56 np0005592767 podman[222045]: 2026-01-22 22:29:56.158330466 +0000 UTC m=+0.068263095 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:29:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:29:56.465 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:00 np0005592767 nova_compute[182623]: 2026-01-22 22:30:00.540 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:30:00 np0005592767 nova_compute[182623]: 2026-01-22 22:30:00.756 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:00 np0005592767 nova_compute[182623]: 2026-01-22 22:30:00.792 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.645 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.671 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Triggering sync for uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.672 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.673 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.674 182627 INFO nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] During sync_power_state the instance has a pending task (rebuilding). Skip.#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.674 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:02 np0005592767 kernel: tap60464675-d6 (unregistering): left promiscuous mode
Jan 22 17:30:02 np0005592767 NetworkManager[54973]: <info>  [1769121002.7594] device (tap60464675-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.775 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:02Z|00285|binding|INFO|Releasing lport 60464675-d651-488b-a1aa-832103327e7f from this chassis (sb_readonly=0)
Jan 22 17:30:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:02Z|00286|binding|INFO|Setting lport 60464675-d651-488b-a1aa-832103327e7f down in Southbound
Jan 22 17:30:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:02Z|00287|binding|INFO|Removing iface tap60464675-d6 ovn-installed in OVS
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.779 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:02 np0005592767 nova_compute[182623]: 2026-01-22 22:30:02.790 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:02.792 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:33 10.100.0.11'], port_security=['fa:16:3e:db:51:33 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=60464675-d651-488b-a1aa-832103327e7f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:02.793 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 60464675-d651-488b-a1aa-832103327e7f in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:30:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:02.795 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:30:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:02.796 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f646864c-e8dc-4955-a2a7-9042be3e9955]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:02.796 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore#033[00m
Jan 22 17:30:02 np0005592767 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 22 17:30:02 np0005592767 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004d.scope: Consumed 13.327s CPU time.
Jan 22 17:30:02 np0005592767 systemd-machined[153912]: Machine qemu-37-instance-0000004d terminated.
Jan 22 17:30:02 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [NOTICE]   (221930) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:02 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [NOTICE]   (221930) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:02 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [WARNING]  (221930) : Exiting Master process...
Jan 22 17:30:02 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [ALERT]    (221930) : Current worker (221932) exited with code 143 (Terminated)
Jan 22 17:30:02 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[221926]: [WARNING]  (221930) : All workers exited. Exiting... (0)
Jan 22 17:30:02 np0005592767 systemd[1]: libpod-f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96.scope: Deactivated successfully.
Jan 22 17:30:02 np0005592767 podman[222092]: 2026-01-22 22:30:02.956028615 +0000 UTC m=+0.044117461 container died f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:30:02 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:02 np0005592767 systemd[1]: var-lib-containers-storage-overlay-ed1b5148ec203eab79441d3aaaf95e6a4aa56643d1ec4638a19b5f9575a46fae-merged.mount: Deactivated successfully.
Jan 22 17:30:03 np0005592767 podman[222092]: 2026-01-22 22:30:03.001798872 +0000 UTC m=+0.089887708 container cleanup f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 17:30:03 np0005592767 systemd[1]: libpod-conmon-f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96.scope: Deactivated successfully.
Jan 22 17:30:03 np0005592767 podman[222135]: 2026-01-22 22:30:03.066389132 +0000 UTC m=+0.040718715 container remove f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.071 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8ece4381-959e-49fd-9790-208262158d10]: (4, ('Thu Jan 22 10:30:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96)\nf86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96\nThu Jan 22 10:30:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (f86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96)\nf86f0abdb9340391db4e7196323ee6607f01597df91d0231fcd2950428aeba96\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.073 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[21c84ae6-8245-4460-8c5a-cd7fdf39a109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.074 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.076 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:03 np0005592767 kernel: tap354683a7-30: left promiscuous mode
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.090 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.092 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.094 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[63c4296e-1338-4f2e-a604-2439ed081baa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.110 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eb29f857-5116-4e72-afae-309a98629100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.112 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[218ef02c-135a-4169-a662-e6f677e8be75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.127 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[efccd71c-53a5-4b9d-b357-29e89d75f0a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457819, 'reachable_time': 43154, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222159, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.129 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:30:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:03.129 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[cba714f4-6682-4e20-be21-73f5ed414076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:03 np0005592767 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.520 182627 DEBUG nova.compute.manager [req-745f1349-8e8c-418b-bd21-5ea3cdf314a9 req-d541495d-6103-4b2f-8c3e-bd6c0062a9ce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-unplugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.521 182627 DEBUG oslo_concurrency.lockutils [req-745f1349-8e8c-418b-bd21-5ea3cdf314a9 req-d541495d-6103-4b2f-8c3e-bd6c0062a9ce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.522 182627 DEBUG oslo_concurrency.lockutils [req-745f1349-8e8c-418b-bd21-5ea3cdf314a9 req-d541495d-6103-4b2f-8c3e-bd6c0062a9ce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.522 182627 DEBUG oslo_concurrency.lockutils [req-745f1349-8e8c-418b-bd21-5ea3cdf314a9 req-d541495d-6103-4b2f-8c3e-bd6c0062a9ce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.523 182627 DEBUG nova.compute.manager [req-745f1349-8e8c-418b-bd21-5ea3cdf314a9 req-d541495d-6103-4b2f-8c3e-bd6c0062a9ce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] No waiting events found dispatching network-vif-unplugged-60464675-d651-488b-a1aa-832103327e7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.524 182627 WARNING nova.compute.manager [req-745f1349-8e8c-418b-bd21-5ea3cdf314a9 req-d541495d-6103-4b2f-8c3e-bd6c0062a9ce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received unexpected event network-vif-unplugged-60464675-d651-488b-a1aa-832103327e7f for instance with vm_state active and task_state rebuilding.#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.556 182627 INFO nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.562 182627 INFO nova.virt.libvirt.driver [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance destroyed successfully.#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.566 182627 INFO nova.virt.libvirt.driver [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance destroyed successfully.#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.567 182627 DEBUG nova.virt.libvirt.vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:29:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1748424325',display_name='tempest-ServerDiskConfigTestJSON-server-1748424325',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1748424325',id=77,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-us2yub10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-
member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:29:49Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=c98a3885-eda6-4fd8-a2c3-73b2a825cbd3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.568 182627 DEBUG nova.network.os_vif_util [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.569 182627 DEBUG nova.network.os_vif_util [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.569 182627 DEBUG os_vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.571 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.571 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60464675-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.572 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.574 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.576 182627 INFO os_vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6')#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.577 182627 INFO nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Deleting instance files /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3_del#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.578 182627 INFO nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Deletion of /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3_del complete#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.870 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.871 182627 INFO nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Creating image(s)#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.872 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.872 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.873 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.899 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.962 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.963 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.964 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:03 np0005592767 nova_compute[182623]: 2026-01-22 22:30:03.983 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.041 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.043 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.093 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.095 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.096 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.166 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.168 182627 DEBUG nova.virt.disk.api [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.169 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.231 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.232 182627 DEBUG nova.virt.disk.api [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.233 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.234 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Ensure instance console log exists: /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.234 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.235 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.236 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.240 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Start _get_guest_xml network_info=[{"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.245 182627 WARNING nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.253 182627 DEBUG nova.virt.libvirt.host [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.254 182627 DEBUG nova.virt.libvirt.host [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.257 182627 DEBUG nova.virt.libvirt.host [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.258 182627 DEBUG nova.virt.libvirt.host [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.259 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.259 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.260 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.260 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.261 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.261 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.261 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.261 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.262 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.262 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.262 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.262 182627 DEBUG nova.virt.hardware [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.263 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.283 182627 DEBUG nova.virt.libvirt.vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:29:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1748424325',display_name='tempest-ServerDiskConfigTestJSON-server-1748424325',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1748424325',id=77,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-us2yub10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:03Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=c98a3885-eda6-4fd8-a2c3-73b2a825cbd3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.284 182627 DEBUG nova.network.os_vif_util [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.285 182627 DEBUG nova.network.os_vif_util [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.287 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <uuid>c98a3885-eda6-4fd8-a2c3-73b2a825cbd3</uuid>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <name>instance-0000004d</name>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1748424325</nova:name>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:30:04</nova:creationTime>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        <nova:port uuid="60464675-d651-488b-a1aa-832103327e7f">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <entry name="serial">c98a3885-eda6-4fd8-a2c3-73b2a825cbd3</entry>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <entry name="uuid">c98a3885-eda6-4fd8-a2c3-73b2a825cbd3</entry>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:db:51:33"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <target dev="tap60464675-d6"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/console.log" append="off"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:30:04 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:30:04 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:30:04 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:30:04 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.288 182627 DEBUG nova.compute.manager [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Preparing to wait for external event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.289 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.289 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.290 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.290 182627 DEBUG nova.virt.libvirt.vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:29:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1748424325',display_name='tempest-ServerDiskConfigTestJSON-server-1748424325',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1748424325',id=77,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:29:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-us2yub10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:03Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=c98a3885-eda6-4fd8-a2c3-73b2a825cbd3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.291 182627 DEBUG nova.network.os_vif_util [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.291 182627 DEBUG nova.network.os_vif_util [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.292 182627 DEBUG os_vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.292 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.293 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.293 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.296 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.297 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60464675-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.297 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60464675-d6, col_values=(('external_ids', {'iface-id': '60464675-d651-488b-a1aa-832103327e7f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:51:33', 'vm-uuid': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.299 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:04 np0005592767 NetworkManager[54973]: <info>  [1769121004.3001] manager: (tap60464675-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.301 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.304 182627 INFO os_vif [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6')#033[00m
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.360 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.361 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.361 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:db:51:33, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.362 182627 INFO nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Using config drive
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.376 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'ec2_ids' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:04 np0005592767 nova_compute[182623]: 2026-01-22 22:30:04.419 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'keypairs' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.539 182627 INFO nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Creating config drive at /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.544 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp28d0_5nc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.599 182627 DEBUG nova.compute.manager [req-e4ef7325-5ede-4649-bb70-42537e7e17d8 req-e4b4d0e6-169c-4587-8304-8623cd4ea067 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.600 182627 DEBUG oslo_concurrency.lockutils [req-e4ef7325-5ede-4649-bb70-42537e7e17d8 req-e4b4d0e6-169c-4587-8304-8623cd4ea067 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.600 182627 DEBUG oslo_concurrency.lockutils [req-e4ef7325-5ede-4649-bb70-42537e7e17d8 req-e4b4d0e6-169c-4587-8304-8623cd4ea067 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.600 182627 DEBUG oslo_concurrency.lockutils [req-e4ef7325-5ede-4649-bb70-42537e7e17d8 req-e4b4d0e6-169c-4587-8304-8623cd4ea067 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.601 182627 DEBUG nova.compute.manager [req-e4ef7325-5ede-4649-bb70-42537e7e17d8 req-e4b4d0e6-169c-4587-8304-8623cd4ea067 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Processing event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.666 182627 DEBUG oslo_concurrency.processutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp28d0_5nc" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:30:05 np0005592767 kernel: tap60464675-d6: entered promiscuous mode
Jan 22 17:30:05 np0005592767 NetworkManager[54973]: <info>  [1769121005.7200] manager: (tap60464675-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Jan 22 17:30:05 np0005592767 systemd-udevd[222073]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:05Z|00288|binding|INFO|Claiming lport 60464675-d651-488b-a1aa-832103327e7f for this chassis.
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.723 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:05Z|00289|binding|INFO|60464675-d651-488b-a1aa-832103327e7f: Claiming fa:16:3e:db:51:33 10.100.0.11
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.730 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:33 10.100.0.11'], port_security=['fa:16:3e:db:51:33 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=60464675-d651-488b-a1aa-832103327e7f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.731 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 60464675-d651-488b-a1aa-832103327e7f in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.732 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:30:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:05Z|00290|binding|INFO|Setting lport 60464675-d651-488b-a1aa-832103327e7f ovn-installed in OVS
Jan 22 17:30:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:05Z|00291|binding|INFO|Setting lport 60464675-d651-488b-a1aa-832103327e7f up in Southbound
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.737 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:05 np0005592767 NetworkManager[54973]: <info>  [1769121005.7393] device (tap60464675-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:30:05 np0005592767 NetworkManager[54973]: <info>  [1769121005.7399] device (tap60464675-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.743 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.747 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61bd832b-ab2c-43a5-bb48-b5431cb701d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.748 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.749 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.750 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[55215ef7-5ca8-46f9-b9bc-933af1f235a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.750 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf57990-6a03-47a4-9d98-de0dc4d184f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.763 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[5b429e25-ca71-41eb-ba00-2f93ccdabe2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 systemd-machined[153912]: New machine qemu-38-instance-0000004d.
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.775 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2364b3d7-4b80-4da6-b290-e008025e433f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 systemd[1]: Started Virtual Machine qemu-38-instance-0000004d.
Jan 22 17:30:05 np0005592767 nova_compute[182623]: 2026-01-22 22:30:05.793 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.806 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[42cc558e-87ef-42f3-b14d-472bd64cdb12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.813 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3d015a74-2602-4ea5-b8e8-3b02cdac7356]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 NetworkManager[54973]: <info>  [1769121005.8144] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Jan 22 17:30:05 np0005592767 systemd-udevd[222208]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.857 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0051ade3-fadc-4d09-94ad-244d609a8a25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.861 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7ffda0-d448-440b-80c6-0c78624e6fd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 NetworkManager[54973]: <info>  [1769121005.8790] device (tap354683a7-30): carrier: link connected
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.882 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1d80b06a-6bc0-48d6-8e37-7fe0cb56b588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.899 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5c699a-4c4d-44f6-9c03-b7b0e2753159]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460245, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222229, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.916 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f0268e02-5166-4ce0-959d-59004ab90136]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460245, 'tstamp': 460245}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222230, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.937 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f18a76af-b55a-4d8e-94bf-da5272a0d2c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460245, 'reachable_time': 17840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222231, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:05.974 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb18b1f-cecc-4071-a4d5-d66985aeef1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.023 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[186dca17-6978-421b-9ac0-6500fdadbb6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.025 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.026 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.027 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.029 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:06 np0005592767 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.033 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:06 np0005592767 NetworkManager[54973]: <info>  [1769121006.0334] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.034 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.035 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:06Z|00292|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.060 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.061 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.062 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6b290d-b9fe-4b4d-8992-06e7dab41e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.063 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:30:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:06.065 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.195 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.196 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121006.1949286, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.197 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Started (Lifecycle Event)
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.199 182627 DEBUG nova.compute.manager [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.203 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.207 182627 INFO nova.virt.libvirt.driver [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance spawned successfully.
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.208 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.232 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.240 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.246 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.246 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.247 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.247 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.248 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.248 182627 DEBUG nova.virt.libvirt.driver [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.280 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.281 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121006.1952899, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.282 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.315 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.326 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121006.2021525, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.326 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.330 182627 DEBUG nova.compute.manager [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.354 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.358 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.376 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.417 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.418 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.418 182627 DEBUG nova.objects.instance [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:30:06 np0005592767 podman[222268]: 2026-01-22 22:30:06.461726875 +0000 UTC m=+0.050477371 container create 536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:06 np0005592767 nova_compute[182623]: 2026-01-22 22:30:06.482 182627 DEBUG oslo_concurrency.lockutils [None req-34f294d3-5bbf-462c-b619-76cb00fdb4d5 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:06 np0005592767 systemd[1]: Started libpod-conmon-536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785.scope.
Jan 22 17:30:06 np0005592767 podman[222268]: 2026-01-22 22:30:06.43472552 +0000 UTC m=+0.023476036 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:30:06 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:30:06 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4644bf74ae3a90e347677a7dc0be075bd146c413a21b677974f3cc58a0e89f9b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:30:06 np0005592767 podman[222268]: 2026-01-22 22:30:06.567602415 +0000 UTC m=+0.156352931 container init 536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:30:06 np0005592767 podman[222268]: 2026-01-22 22:30:06.574688246 +0000 UTC m=+0.163438742 container start 536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 17:30:06 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [NOTICE]   (222286) : New worker (222288) forked
Jan 22 17:30:06 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [NOTICE]   (222286) : Loading success.
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.324 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '708eb5a130224bd188eae5ec27c67df5', 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'hostId': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.338 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.340 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd489642e-0996-4dd8-b4ea-26add4e222e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.327656', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e79391c2-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.962611065, 'message_signature': '4273c8ce30f8f07c6653b009236743ea36ad801d35807b17f3a6965ea53eb973'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.327656', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e793c250-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.962611065, 'message_signature': '2fdad98ff3e8cab3d9a0fe7d4dfe5d29e380df964780052e370a8e72fefebc10'}]}, 'timestamp': '2026-01-22 22:30:07.341127', '_unique_id': 'bea809bd88414800bc3c1365895bf825'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.379 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.380 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8a5b41e-0bac-43f3-8fd8-08546419e16e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.352772', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e799cee8-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': '3e203ef45715ed7a2f3bfb3c5f3fdf69ed73e3d8671ff1993a26a2633c02ce2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.352772', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e799e112-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': '518ea4807ac092227ff4feb67d0334ff8fff26aa3179488702936b068c7e785e'}]}, 'timestamp': '2026-01-22 22:30:07.381035', '_unique_id': 'a053480c41e4428f896ef4daa4f1840a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.382 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.385 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.387 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 / tap60464675-d6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.388 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e860d9ac-dbda-44ca-b78e-40447d172dbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.385937', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e79b0858-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': '27a82c2989dbe1a64de862aa83d134078a50c84ce95012185f4d0d92aa57b6a7'}]}, 'timestamp': '2026-01-22 22:30:07.388624', '_unique_id': 'b39be713540a4e61a0758d5bdca43498'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.392 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.392 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3837095e-4f28-4c79-9b10-080dd51546f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.392707', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e79bb83e-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': '5a5160af10adb766956380656c81916ee95fd4a20a3277968c531c3cf9d76d5a'}]}, 'timestamp': '2026-01-22 22:30:07.393133', '_unique_id': 'b64fd4e94b3d4f9b981850ebfdd9c69b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.393 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.396 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.410 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.410 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c98a3885-eda6-4fd8-a2c3-73b2a825cbd3: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.410 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.410 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.411 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>]
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.411 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.411 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.412 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d0a8d09-5975-4e2e-9163-3bfb8ab7fe2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.411697', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e79e9d06-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': '1ff2ae77d0fcb720dcd09154883f396d53e64ec320d35555cd87bf0873dd704f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 
'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.411697', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e79eac06-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'a3b9c0897033f29558db3c0e6a8ced5cd13add77be7a031e5caa867cf4f7592a'}]}, 'timestamp': '2026-01-22 22:30:07.412438', '_unique_id': 'f387508953e54659b9e9f3aa1d9183c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.416 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.416 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec54199e-c8a4-42dd-a036-88c7088eacd3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.416170', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e79f4c6a-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'af6542c9995700cc341579bc83b3cefb74cbdeb9a785dd0048af8e29b260a553'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 
'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.416170', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e79f5886-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'd375528bd5ee4f2e543c45c334fccd57403219d06897ea3c4ec1375d0dbe3805'}]}, 'timestamp': '2026-01-22 22:30:07.416845', '_unique_id': '8b43f6fc39e940f59631d9e78a272bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.420 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bf35923-67fc-4694-a5fb-7410a954267e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.420366', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e79fef76-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': '265c4e4269c21a751339457423664d36c67040aa41f899ffee272a91917506c5'}]}, 'timestamp': '2026-01-22 22:30:07.420751', '_unique_id': 'e3bf8859b1db48d6917e3f7456fae886'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.421 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.423 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.424 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.424 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>]
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.424 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97fc8932-bde5-4da2-83f2-802328008675', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.424798', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e7a09bb0-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.962611065, 'message_signature': '3480e739ebd5a21ffb129d76d90fa80268a6cd16344fa68239814f6d823e8136'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.424798', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e7a0a966-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.962611065, 'message_signature': 'f60768f33c487b88deeb0295271c44804914f1a582322a7bb1b61185d467d464'}]}, 'timestamp': '2026-01-22 22:30:07.425480', '_unique_id': '83c2f8d401bb4689a25248ec2ce8e42b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.429 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.read.latency volume: 127546657 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.429 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.read.latency volume: 363370 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e9a9b4d-7839-4858-9b5d-30139a4d09d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 127546657, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.429119', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e7a14632-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': '2d4f37035649823c50ade2b8eb4b374926e58e274ab7c11ae1df61396dc30303'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 363370, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.429119', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e7a152f8-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'ca7573f5240e0c916a8a299aa6e8a4e355edaee2751e585b810314a652f2dfb4'}]}, 'timestamp': '2026-01-22 22:30:07.429809', '_unique_id': 'ccc85341efd3435c88339366cfdc7473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.433 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.433 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.434 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57be7ac3-7685-4a8f-b898-3462772412d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.433942', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e7a2019e-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'd383a6f7ee1f621721788efecd63f9c6340620cc344883ec96a089c82966cbf7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.433942', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e7a20f90-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'a14418dfe2645223d4170591572080fea7d26c39c825900eba4bf660464c4fff'}]}, 'timestamp': '2026-01-22 22:30:07.434640', '_unique_id': '49d85edadcec44448d7023e1b9a4e0c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.435 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.438 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.438 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49be9c86-5689-4598-bb41-60fd7935141a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.438204', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a2aa22-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': 'f6236784e66fc58b9a3094db096a50c8933eb82e5185c71775fb98b57749312e'}]}, 'timestamp': '2026-01-22 22:30:07.438606', '_unique_id': '0bcc7ca0415048fba4005f45addba6c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b219fb87-3acd-45c3-abeb-475ea902f98d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.442086', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a34004-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': 'a69a3834e363393af86b4971624f3c3ce2514139ba9361de75135a1af01e5dd8'}]}, 'timestamp': '2026-01-22 22:30:07.442497', '_unique_id': '64e6dc8c62b244f895b081e3a2bfb624'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.445 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.445 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.445 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f363116-3e57-4b60-97dd-c336f8659263', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.445615', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e7a3c812-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': '3d95d737c39c021e5303b8300a1bc18a507305578dba5715d0b775a11956250d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.445615', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e7a3d366-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.987686096, 'message_signature': 'f9431e89cea271c8db80d3df00e328f033d9c826b500f4908b8d0fd7ea62d720'}]}, 'timestamp': '2026-01-22 22:30:07.446247', '_unique_id': '906696f61c564d83a8de66a6cff6f0c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.447 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e01bfa8f-6873-497f-89c9-8a85fd833e9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-vda', 'timestamp': '2026-01-22T22:30:07.447861', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e7a41eac-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.962611065, 'message_signature': '042874c8e1d04799bb4e41c0e047fa4511570ad9cede41f08809a884a26ed4c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-sda', 'timestamp': '2026-01-22T22:30:07.447861', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e7a4291a-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4603.962611065, 'message_signature': 'fad4cea8d1fe9b4ff29bcea7a4cc95b29ab556ab088921c22b14c3d6537e67f4'}]}, 'timestamp': '2026-01-22 22:30:07.448404', '_unique_id': '3cc0a1b51f7b42e6915615575486845a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.448 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.449 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.449 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/cpu volume: 1140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c8ef5f3-c93e-4100-bd3b-7cc562cacd3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1140000000, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'timestamp': '2026-01-22T22:30:07.449928', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'instance-0000004d', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e7a470be-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.045041871, 'message_signature': '6af1ba358d5a8f9580223bef01fcb212b28f78543b6e7381a0cef3590a4159b7'}]}, 'timestamp': '2026-01-22 22:30:07.450267', '_unique_id': '5768946cd44249f186f5d4c0720a1dad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.450 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.451 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13ce0a66-033a-48c6-92f1-dd536ba95b66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.451790', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a4b7ea-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': 'cdeb44743d6f0ca673551bfc6dcb469ad3aa092c4023a9469d5488e7578ef90a'}]}, 'timestamp': '2026-01-22 22:30:07.452067', '_unique_id': '5e274945ac8d48798ae362d9d7986535'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.453 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.453 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>]
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23d2bc8f-39b6-4b4c-bef4-ba2958b31402', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.453997', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a50dda-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': '8a4225e3d5964c496d5b517393f2e7647a58983e1ec08d33c98b039a618e7f0d'}]}, 'timestamp': '2026-01-22 22:30:07.454283', '_unique_id': '3e0dae5fd8fb4d18b5299e827e2dda76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.454 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.455 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.455 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1748424325>]
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.456 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b481b3a4-e7f8-4629-b9f4-32a6aa0b86c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.456243', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a56622-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': '7bd76c19db59feb4dadab30b4c3dba0aa92c1f5128dbf10927b6dbd360d8bbea'}]}, 'timestamp': '2026-01-22 22:30:07.456540', '_unique_id': '2bb2124e07084b7b891e420595dc7a01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bd9db11-b35a-46e3-8da8-9d24eece8b21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.458053', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a5acea-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': 'c2c7538c25d3d505b9921447fef42f6fe5c48d4e1e371a8170dc334b44d1f751'}]}, 'timestamp': '2026-01-22 22:30:07.458345', '_unique_id': '7d16ebb0286346e49dabe253681f721b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.459 12 DEBUG ceilometer.compute.pollsters [-] c98a3885-eda6-4fd8-a2c3-73b2a825cbd3/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad464af5-b0c9-435f-a902-d68764f7da2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_name': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_name': None, 'resource_id': 'instance-0000004d-c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-tap60464675-d6', 'timestamp': '2026-01-22T22:30:07.459843', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1748424325', 'name': 'tap60464675-d6', 'instance_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'instance_type': 'm1.nano', 'host': '763ec8f25784c3c93dce03bf49235ddf21518e55bb17fcab40a62ed3', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:51:33', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60464675-d6'}, 'message_id': 'e7a5f24a-f7e1-11f0-a43a-fa163ed01feb', 'monotonic_time': 4604.020824035, 'message_signature': '5c0f8505435a262d5c01aa35d7e9854170ab1912fc42fd6a3b1683aab84f0e26'}]}, 'timestamp': '2026-01-22 22:30:07.460114', '_unique_id': '173e264a6ae84b039694c5ed0b48ae82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:30:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:30:07.460 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.755 182627 DEBUG nova.compute.manager [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.755 182627 DEBUG oslo_concurrency.lockutils [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.756 182627 DEBUG oslo_concurrency.lockutils [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.756 182627 DEBUG oslo_concurrency.lockutils [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.757 182627 DEBUG nova.compute.manager [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] No waiting events found dispatching network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.757 182627 WARNING nova.compute.manager [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received unexpected event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f for instance with vm_state active and task_state None.#033[00m
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.758 182627 DEBUG nova.compute.manager [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.759 182627 DEBUG oslo_concurrency.lockutils [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.760 182627 DEBUG oslo_concurrency.lockutils [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.760 182627 DEBUG oslo_concurrency.lockutils [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.761 182627 DEBUG nova.compute.manager [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] No waiting events found dispatching network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:07 np0005592767 nova_compute[182623]: 2026-01-22 22:30:07.762 182627 WARNING nova.compute.manager [req-905b758a-7fd0-42b8-827f-ecaffda1fd30 req-8899774d-8048-4bd6-ac7b-2fb004c72a62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received unexpected event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f for instance with vm_state active and task_state None.#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.122 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.123 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.123 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.123 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.124 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.134 182627 INFO nova.compute.manager [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Terminating instance#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.143 182627 DEBUG nova.compute.manager [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:30:09 np0005592767 kernel: tap60464675-d6 (unregistering): left promiscuous mode
Jan 22 17:30:09 np0005592767 NetworkManager[54973]: <info>  [1769121009.1704] device (tap60464675-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:09Z|00293|binding|INFO|Releasing lport 60464675-d651-488b-a1aa-832103327e7f from this chassis (sb_readonly=0)
Jan 22 17:30:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:09Z|00294|binding|INFO|Setting lport 60464675-d651-488b-a1aa-832103327e7f down in Southbound
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.229 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:09Z|00295|binding|INFO|Removing iface tap60464675-d6 ovn-installed in OVS
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.232 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.239 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:33 10.100.0.11'], port_security=['fa:16:3e:db:51:33 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=60464675-d651-488b-a1aa-832103327e7f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.241 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 60464675-d651-488b-a1aa-832103327e7f in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.242 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.242 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.243 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[17c8bb1c-5274-40b3-8421-13d7040181fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.244 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore#033[00m
Jan 22 17:30:09 np0005592767 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 22 17:30:09 np0005592767 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004d.scope: Consumed 3.392s CPU time.
Jan 22 17:30:09 np0005592767 systemd-machined[153912]: Machine qemu-38-instance-0000004d terminated.
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.299 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [NOTICE]   (222286) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:09 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [NOTICE]   (222286) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:09 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [WARNING]  (222286) : Exiting Master process...
Jan 22 17:30:09 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [WARNING]  (222286) : Exiting Master process...
Jan 22 17:30:09 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [ALERT]    (222286) : Current worker (222288) exited with code 143 (Terminated)
Jan 22 17:30:09 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222282]: [WARNING]  (222286) : All workers exited. Exiting... (0)
Jan 22 17:30:09 np0005592767 NetworkManager[54973]: <info>  [1769121009.3648] manager: (tap60464675-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Jan 22 17:30:09 np0005592767 systemd[1]: libpod-536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785.scope: Deactivated successfully.
Jan 22 17:30:09 np0005592767 kernel: tap60464675-d6: entered promiscuous mode
Jan 22 17:30:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:09Z|00296|binding|INFO|Claiming lport 60464675-d651-488b-a1aa-832103327e7f for this chassis.
Jan 22 17:30:09 np0005592767 kernel: tap60464675-d6 (unregistering): left promiscuous mode
Jan 22 17:30:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:09Z|00297|binding|INFO|60464675-d651-488b-a1aa-832103327e7f: Claiming fa:16:3e:db:51:33 10.100.0.11
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.366 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 systemd-udevd[222304]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:09 np0005592767 podman[222300]: 2026-01-22 22:30:09.368873188 +0000 UTC m=+0.090803524 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:30:09 np0005592767 podman[222334]: 2026-01-22 22:30:09.36966133 +0000 UTC m=+0.045986074 container died 536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.373 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:33 10.100.0.11'], port_security=['fa:16:3e:db:51:33 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=60464675-d651-488b-a1aa-832103327e7f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.386 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:09Z|00298|binding|INFO|Releasing lport 60464675-d651-488b-a1aa-832103327e7f from this chassis (sb_readonly=0)
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.393 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:33 10.100.0.11'], port_security=['fa:16:3e:db:51:33 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c98a3885-eda6-4fd8-a2c3-73b2a825cbd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=60464675-d651-488b-a1aa-832103327e7f) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:09 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:09 np0005592767 systemd[1]: var-lib-containers-storage-overlay-4644bf74ae3a90e347677a7dc0be075bd146c413a21b677974f3cc58a0e89f9b-merged.mount: Deactivated successfully.
Jan 22 17:30:09 np0005592767 podman[222334]: 2026-01-22 22:30:09.417089873 +0000 UTC m=+0.093414617 container cleanup 536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.418 182627 INFO nova.virt.libvirt.driver [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Instance destroyed successfully.#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.418 182627 DEBUG nova.objects.instance [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:09 np0005592767 systemd[1]: libpod-conmon-536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785.scope: Deactivated successfully.
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.436 182627 DEBUG nova.virt.libvirt.vif [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:29:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1748424325',display_name='tempest-ServerDiskConfigTestJSON-server-1748424325',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1748424325',id=77,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-us2yub10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:06Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=c98a3885-eda6-4fd8-a2c3-73b2a825cbd3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.436 182627 DEBUG nova.network.os_vif_util [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "60464675-d651-488b-a1aa-832103327e7f", "address": "fa:16:3e:db:51:33", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60464675-d6", "ovs_interfaceid": "60464675-d651-488b-a1aa-832103327e7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.437 182627 DEBUG nova.network.os_vif_util [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.437 182627 DEBUG os_vif [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.439 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60464675-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.441 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.442 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.444 182627 INFO os_vif [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:33,bridge_name='br-int',has_traffic_filtering=True,id=60464675-d651-488b-a1aa-832103327e7f,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60464675-d6')#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.445 182627 INFO nova.virt.libvirt.driver [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Deleting instance files /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3_del#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.445 182627 INFO nova.virt.libvirt.driver [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Deletion of /var/lib/nova/instances/c98a3885-eda6-4fd8-a2c3-73b2a825cbd3_del complete#033[00m
Jan 22 17:30:09 np0005592767 podman[222380]: 2026-01-22 22:30:09.488595459 +0000 UTC m=+0.044841651 container remove 536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.493 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[511505f7-5d7c-425a-8a6b-aad2fbe25ae1]: (4, ('Thu Jan 22 10:30:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785)\n536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785\nThu Jan 22 10:30:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785)\n536976cf026e90dc292cec33d9ab5102df6aaf0ea175d254187213e346cfd785\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.495 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[51452225-a84e-4979-9315-5b5673cd6ca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.496 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.499 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 kernel: tap354683a7-30: left promiscuous mode
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.509 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.512 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[93ce90c1-f911-47be-8e97-f31d9a76dfef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.528 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[82ffdf5a-07ce-4fba-83c5-11cc00bed1ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.529 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2ddae68e-6fc0-4cf7-a45a-2dae765f2cbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.547 182627 INFO nova.compute.manager [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.548 182627 DEBUG oslo.service.loopingcall [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.548 182627 DEBUG nova.compute.manager [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:30:09 np0005592767 nova_compute[182623]: 2026-01-22 22:30:09.549 182627 DEBUG nova.network.neutron [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.557 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c571ac00-7f71-4592-ae7d-e4800816bc5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460238, 'reachable_time': 32801, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222396, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.561 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.562 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c22b0f9b-927f-45f6-8371-106c9767cd9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.563 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 60464675-d651-488b-a1aa-832103327e7f in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.565 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.566 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[67f94628-f17b-46c4-bb09-22e35eefd36a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.566 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 60464675-d651-488b-a1aa-832103327e7f in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.568 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:30:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:09.568 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3383dc12-ffee-432c-81da-f9ab881ac699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.374 182627 DEBUG nova.compute.manager [req-830de915-92bf-4c2e-a739-20381dc5910c req-904b84f3-b4b6-4c40-bd25-12e5a2cef86b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-unplugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.376 182627 DEBUG oslo_concurrency.lockutils [req-830de915-92bf-4c2e-a739-20381dc5910c req-904b84f3-b4b6-4c40-bd25-12e5a2cef86b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.376 182627 DEBUG oslo_concurrency.lockutils [req-830de915-92bf-4c2e-a739-20381dc5910c req-904b84f3-b4b6-4c40-bd25-12e5a2cef86b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.376 182627 DEBUG oslo_concurrency.lockutils [req-830de915-92bf-4c2e-a739-20381dc5910c req-904b84f3-b4b6-4c40-bd25-12e5a2cef86b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.377 182627 DEBUG nova.compute.manager [req-830de915-92bf-4c2e-a739-20381dc5910c req-904b84f3-b4b6-4c40-bd25-12e5a2cef86b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] No waiting events found dispatching network-vif-unplugged-60464675-d651-488b-a1aa-832103327e7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.377 182627 DEBUG nova.compute.manager [req-830de915-92bf-4c2e-a739-20381dc5910c req-904b84f3-b4b6-4c40-bd25-12e5a2cef86b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-unplugged-60464675-d651-488b-a1aa-832103327e7f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.796 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.930 182627 DEBUG nova.network.neutron [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:10 np0005592767 nova_compute[182623]: 2026-01-22 22:30:10.965 182627 INFO nova.compute.manager [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Took 1.42 seconds to deallocate network for instance.#033[00m
Jan 22 17:30:11 np0005592767 nova_compute[182623]: 2026-01-22 22:30:11.056 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:11 np0005592767 nova_compute[182623]: 2026-01-22 22:30:11.056 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:11 np0005592767 nova_compute[182623]: 2026-01-22 22:30:11.124 182627 DEBUG nova.compute.provider_tree [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:30:11 np0005592767 nova_compute[182623]: 2026-01-22 22:30:11.145 182627 DEBUG nova.scheduler.client.report [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:30:11 np0005592767 nova_compute[182623]: 2026-01-22 22:30:11.176 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.053 182627 INFO nova.scheduler.client.report [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Deleted allocations for instance c98a3885-eda6-4fd8-a2c3-73b2a825cbd3#033[00m
Jan 22 17:30:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:12.101 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:12.102 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:12.102 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.177 182627 DEBUG oslo_concurrency.lockutils [None req-8fe642f0-eb1d-40e7-b0e5-cc3905030a1d b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.522 182627 DEBUG nova.compute.manager [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.523 182627 DEBUG oslo_concurrency.lockutils [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.523 182627 DEBUG oslo_concurrency.lockutils [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.523 182627 DEBUG oslo_concurrency.lockutils [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c98a3885-eda6-4fd8-a2c3-73b2a825cbd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.524 182627 DEBUG nova.compute.manager [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] No waiting events found dispatching network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.525 182627 WARNING nova.compute.manager [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received unexpected event network-vif-plugged-60464675-d651-488b-a1aa-832103327e7f for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.525 182627 DEBUG nova.compute.manager [req-3ac3354a-f184-4f97-843d-ed2947f5ee3c req-5b973489-3172-4da9-979f-ded300273f7f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Received event network-vif-deleted-60464675-d651-488b-a1aa-832103327e7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.810 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.811 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.830 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.975 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.975 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.985 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:30:12 np0005592767 nova_compute[182623]: 2026-01-22 22:30:12.986 182627 INFO nova.compute.claims [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.153 182627 DEBUG nova.compute.provider_tree [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.173 182627 DEBUG nova.scheduler.client.report [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.202 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.203 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.290 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.290 182627 DEBUG nova.network.neutron [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.313 182627 INFO nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.332 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.451 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.453 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.453 182627 INFO nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Creating image(s)#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.453 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.454 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.454 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.467 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.563 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.565 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.566 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.593 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.664 182627 DEBUG nova.policy [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b08cde28781a46649c6528e52d00b1c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '708eb5a130224bd188eae5ec27c67df5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.668 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.669 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.702 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.703 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.703 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.794 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.796 182627 DEBUG nova.virt.disk.api [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.797 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.874 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.876 182627 DEBUG nova.virt.disk.api [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.876 182627 DEBUG nova.objects.instance [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.891 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.891 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Ensure instance console log exists: /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.892 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.893 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.893 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:13 np0005592767 nova_compute[182623]: 2026-01-22 22:30:13.926 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:14 np0005592767 nova_compute[182623]: 2026-01-22 22:30:14.443 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:15 np0005592767 podman[222413]: 2026-01-22 22:30:15.173311591 +0000 UTC m=+0.077669692 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:30:15 np0005592767 podman[222412]: 2026-01-22 22:30:15.225067128 +0000 UTC m=+0.129423629 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:30:15 np0005592767 nova_compute[182623]: 2026-01-22 22:30:15.573 182627 DEBUG nova.network.neutron [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Successfully created port: 50b7281e-d0dc-4caf-a920-24203f11da00 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:30:15 np0005592767 nova_compute[182623]: 2026-01-22 22:30:15.798 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:15 np0005592767 nova_compute[182623]: 2026-01-22 22:30:15.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:15 np0005592767 nova_compute[182623]: 2026-01-22 22:30:15.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.109 182627 DEBUG nova.network.neutron [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Successfully updated port: 50b7281e-d0dc-4caf-a920-24203f11da00 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.133 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.134 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.134 182627 DEBUG nova.network.neutron [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.300 182627 DEBUG nova.network.neutron [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.920 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.921 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:30:17 np0005592767 nova_compute[182623]: 2026-01-22 22:30:17.922 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.927 182627 DEBUG nova.network.neutron [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.955 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.956 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.956 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.956 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.983 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.984 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance network_info: |[{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.989 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Start _get_guest_xml network_info=[{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:30:18 np0005592767 nova_compute[182623]: 2026-01-22 22:30:18.995 182627 WARNING nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.051 182627 DEBUG nova.virt.libvirt.host [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.052 182627 DEBUG nova.virt.libvirt.host [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.058 182627 DEBUG nova.virt.libvirt.host [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.060 182627 DEBUG nova.virt.libvirt.host [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.062 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.062 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.064 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.064 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.065 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.065 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.066 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.066 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.067 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.068 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.068 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.069 182627 DEBUG nova.virt.hardware [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.075 182627 DEBUG nova.virt.libvirt.vif [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:13Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.076 182627 DEBUG nova.network.os_vif_util [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.077 182627 DEBUG nova.network.os_vif_util [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.078 182627 DEBUG nova.objects.instance [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.099 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <uuid>5a9390a0-5077-46b6-8f6c-b3b308db8b1d</uuid>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <name>instance-00000051</name>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1146472499</nova:name>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:30:18</nova:creationTime>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        <nova:port uuid="50b7281e-d0dc-4caf-a920-24203f11da00">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <entry name="serial">5a9390a0-5077-46b6-8f6c-b3b308db8b1d</entry>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <entry name="uuid">5a9390a0-5077-46b6-8f6c-b3b308db8b1d</entry>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:74:11:5a"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <target dev="tap50b7281e-d0"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/console.log" append="off"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:30:19 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:30:19 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:30:19 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:30:19 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.100 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Preparing to wait for external event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.101 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.101 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.102 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.103 182627 DEBUG nova.virt.libvirt.vif [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:13Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.104 182627 DEBUG nova.network.os_vif_util [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.105 182627 DEBUG nova.network.os_vif_util [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.105 182627 DEBUG os_vif [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.107 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.108 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.112 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b7281e-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.112 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50b7281e-d0, col_values=(('external_ids', {'iface-id': '50b7281e-d0dc-4caf-a920-24203f11da00', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:11:5a', 'vm-uuid': '5a9390a0-5077-46b6-8f6c-b3b308db8b1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:19 np0005592767 NetworkManager[54973]: <info>  [1769121019.1157] manager: (tap50b7281e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.120 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.121 182627 INFO os_vif [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0')#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.135 182627 DEBUG nova.compute.manager [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-changed-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.135 182627 DEBUG nova.compute.manager [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Refreshing instance network info cache due to event network-changed-50b7281e-d0dc-4caf-a920-24203f11da00. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.136 182627 DEBUG oslo_concurrency.lockutils [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.136 182627 DEBUG oslo_concurrency.lockutils [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.136 182627 DEBUG nova.network.neutron [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Refreshing network info cache for port 50b7281e-d0dc-4caf-a920-24203f11da00 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.177 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.177 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.177 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:74:11:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.178 182627 INFO nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Using config drive#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.267 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.268 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5720MB free_disk=73.23579406738281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.268 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.268 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.366 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 5a9390a0-5077-46b6-8f6c-b3b308db8b1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.367 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.367 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.431 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.459 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.486 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.486 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.816 182627 INFO nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Creating config drive at /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.826 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7rucdi9o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:19 np0005592767 nova_compute[182623]: 2026-01-22 22:30:19.973 182627 DEBUG oslo_concurrency.processutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7rucdi9o" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:20 np0005592767 kernel: tap50b7281e-d0: entered promiscuous mode
Jan 22 17:30:20 np0005592767 NetworkManager[54973]: <info>  [1769121020.0290] manager: (tap50b7281e-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Jan 22 17:30:20 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:20Z|00299|binding|INFO|Claiming lport 50b7281e-d0dc-4caf-a920-24203f11da00 for this chassis.
Jan 22 17:30:20 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:20Z|00300|binding|INFO|50b7281e-d0dc-4caf-a920-24203f11da00: Claiming fa:16:3e:74:11:5a 10.100.0.13
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.038 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:11:5a 10.100.0.13'], port_security=['fa:16:3e:74:11:5a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5a9390a0-5077-46b6-8f6c-b3b308db8b1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=50b7281e-d0dc-4caf-a920-24203f11da00) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.040 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 50b7281e-d0dc-4caf-a920-24203f11da00 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.042 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3#033[00m
Jan 22 17:30:20 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:20Z|00301|binding|INFO|Setting lport 50b7281e-d0dc-4caf-a920-24203f11da00 ovn-installed in OVS
Jan 22 17:30:20 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:20Z|00302|binding|INFO|Setting lport 50b7281e-d0dc-4caf-a920-24203f11da00 up in Southbound
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.051 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[94b4b1d5-e0f9-4db3-a1f8-cb183e342d56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.051 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.053 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.053 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ba1bf0-df88-4ca7-8bf3-b6e72229d7ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.053 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf8123a-d5fe-4127-b2cd-8d5430f12b02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 systemd-udevd[222477]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.064 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[093f3642-a52d-421d-9935-c29b648df690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 systemd-machined[153912]: New machine qemu-39-instance-00000051.
Jan 22 17:30:20 np0005592767 NetworkManager[54973]: <info>  [1769121020.0696] device (tap50b7281e-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:30:20 np0005592767 NetworkManager[54973]: <info>  [1769121020.0706] device (tap50b7281e-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:30:20 np0005592767 systemd[1]: Started Virtual Machine qemu-39-instance-00000051.
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.087 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7adcc2bf-2537-49b0-b900-c2e49655f26d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.114 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[108e87e2-45d7-4993-9da6-d38d46424193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 systemd-udevd[222482]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.121 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[804128c4-1d16-4c26-bf40-c5adbfb571ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 NetworkManager[54973]: <info>  [1769121020.1222] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/145)
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.147 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d54c4025-0b26-4eb8-bb89-c4984a0f1180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.151 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6edcc91a-50a2-4d87-aabd-a2cc1715e520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 NetworkManager[54973]: <info>  [1769121020.1690] device (tap354683a7-30): carrier: link connected
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.172 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fc306eaf-7991-4b0a-b7a0-e6a073d4d5c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.190 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[db31d53e-fd8a-4e0d-baa9-b90720fc421c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461674, 'reachable_time': 40448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222510, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.202 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c498e8f7-1bf7-4bc2-8f11-55304f2f5a6a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461674, 'tstamp': 461674}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222513, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.222 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56c99842-6ddf-4c28-8b84-9cbb1eaf73e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461674, 'reachable_time': 40448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222514, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.251 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6a98498e-125d-4fcc-995d-3f9029849743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.301 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121020.3010354, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.302 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Started (Lifecycle Event)#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.319 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d2eb0adf-f66a-4cb8-9619-cc3b50dbe007]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.320 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.320 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.321 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:20 np0005592767 NetworkManager[54973]: <info>  [1769121020.3235] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.323 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.325 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:20 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:20Z|00303|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=0)
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.328 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.328 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[385f62a3-e5b0-4ff0-9511-2f7927be9518]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.329 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.330 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:20.330 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.333 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.337 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121020.301149, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.338 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.340 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.366 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.370 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.389 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.417 182627 DEBUG nova.compute.manager [req-313a5675-6906-40b5-b636-f948527626b2 req-018dd5ce-8e78-4935-bf45-2e28a94e81ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.417 182627 DEBUG oslo_concurrency.lockutils [req-313a5675-6906-40b5-b636-f948527626b2 req-018dd5ce-8e78-4935-bf45-2e28a94e81ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.418 182627 DEBUG oslo_concurrency.lockutils [req-313a5675-6906-40b5-b636-f948527626b2 req-018dd5ce-8e78-4935-bf45-2e28a94e81ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.418 182627 DEBUG oslo_concurrency.lockutils [req-313a5675-6906-40b5-b636-f948527626b2 req-018dd5ce-8e78-4935-bf45-2e28a94e81ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.418 182627 DEBUG nova.compute.manager [req-313a5675-6906-40b5-b636-f948527626b2 req-018dd5ce-8e78-4935-bf45-2e28a94e81ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Processing event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.419 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.422 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.423 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121020.4223537, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.423 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.428 182627 INFO nova.virt.libvirt.driver [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance spawned successfully.#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.428 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.465 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.466 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.466 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.467 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.467 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.467 182627 DEBUG nova.virt.libvirt.driver [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.471 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.475 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.534 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.563 182627 INFO nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Took 7.11 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.563 182627 DEBUG nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.667 182627 INFO nova.compute.manager [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Took 7.76 seconds to build instance.#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.682 182627 DEBUG oslo_concurrency.lockutils [None req-9a256cfd-82f5-4a26-a5f9-77bf557d4303 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:20 np0005592767 podman[222551]: 2026-01-22 22:30:20.715020471 +0000 UTC m=+0.068875303 container create ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:30:20 np0005592767 systemd[1]: Started libpod-conmon-ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80.scope.
Jan 22 17:30:20 np0005592767 podman[222551]: 2026-01-22 22:30:20.680096841 +0000 UTC m=+0.033951693 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:30:20 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:30:20 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a2075bc9fa9122caed72600041ec15218f34d6e083d4c3afebea41c7669d3d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:30:20 np0005592767 podman[222551]: 2026-01-22 22:30:20.798759263 +0000 UTC m=+0.152614125 container init ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.801 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:20 np0005592767 podman[222551]: 2026-01-22 22:30:20.805649159 +0000 UTC m=+0.159503991 container start ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:30:20 np0005592767 podman[222564]: 2026-01-22 22:30:20.816125285 +0000 UTC m=+0.061115492 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:30:20 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [NOTICE]   (222597) : New worker (222602) forked
Jan 22 17:30:20 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [NOTICE]   (222597) : Loading success.
Jan 22 17:30:20 np0005592767 podman[222567]: 2026-01-22 22:30:20.856593072 +0000 UTC m=+0.092322287 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.932 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.932 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.965 182627 DEBUG nova.network.neutron [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updated VIF entry in instance network info cache for port 50b7281e-d0dc-4caf-a920-24203f11da00. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.966 182627 DEBUG nova.network.neutron [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.968 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:30:20 np0005592767 nova_compute[182623]: 2026-01-22 22:30:20.985 182627 DEBUG oslo_concurrency.lockutils [req-bad26293-765b-4bff-8f39-223dc122f80e req-5a081f5e-e43d-41e4-a81e-55fdbc15a21e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.058 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.058 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.068 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.069 182627 INFO nova.compute.claims [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.226 182627 DEBUG nova.compute.provider_tree [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.245 182627 DEBUG nova.scheduler.client.report [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.271 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.272 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.343 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.344 182627 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.395 182627 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.433 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.663 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.665 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.665 182627 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Creating image(s)#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.666 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "/var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.666 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "/var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.667 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "/var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.679 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.741 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.742 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.743 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.754 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.809 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.810 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.846 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.847 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.848 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.907 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.908 182627 DEBUG nova.virt.disk.api [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Checking if we can resize image /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.908 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.970 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.971 182627 DEBUG nova.virt.disk.api [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Cannot resize image /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.971 182627 DEBUG nova.objects.instance [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lazy-loading 'migration_context' on Instance uuid 1d47834b-5f72-4020-a7cf-5071d682b0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.992 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.993 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Ensure instance console log exists: /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.994 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.995 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:21 np0005592767 nova_compute[182623]: 2026-01-22 22:30:21.996 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:22 np0005592767 nova_compute[182623]: 2026-01-22 22:30:22.378 182627 DEBUG nova.policy [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '24157ae704064825a4f59adf1d187391', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:30:22 np0005592767 nova_compute[182623]: 2026-01-22 22:30:22.487 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:30:23 np0005592767 nova_compute[182623]: 2026-01-22 22:30:23.470 182627 DEBUG nova.compute.manager [req-ef86074b-71b9-425c-b6c4-2380b798ba39 req-19edb650-082a-4744-9374-eda3494f408f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:23 np0005592767 nova_compute[182623]: 2026-01-22 22:30:23.471 182627 DEBUG oslo_concurrency.lockutils [req-ef86074b-71b9-425c-b6c4-2380b798ba39 req-19edb650-082a-4744-9374-eda3494f408f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:23 np0005592767 nova_compute[182623]: 2026-01-22 22:30:23.471 182627 DEBUG oslo_concurrency.lockutils [req-ef86074b-71b9-425c-b6c4-2380b798ba39 req-19edb650-082a-4744-9374-eda3494f408f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:23 np0005592767 nova_compute[182623]: 2026-01-22 22:30:23.472 182627 DEBUG oslo_concurrency.lockutils [req-ef86074b-71b9-425c-b6c4-2380b798ba39 req-19edb650-082a-4744-9374-eda3494f408f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:23 np0005592767 nova_compute[182623]: 2026-01-22 22:30:23.472 182627 DEBUG nova.compute.manager [req-ef86074b-71b9-425c-b6c4-2380b798ba39 req-19edb650-082a-4744-9374-eda3494f408f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:23 np0005592767 nova_compute[182623]: 2026-01-22 22:30:23.473 182627 WARNING nova.compute.manager [req-ef86074b-71b9-425c-b6c4-2380b798ba39 req-19edb650-082a-4744-9374-eda3494f408f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:30:24 np0005592767 nova_compute[182623]: 2026-01-22 22:30:24.160 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:24 np0005592767 nova_compute[182623]: 2026-01-22 22:30:24.410 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121009.4091184, c98a3885-eda6-4fd8-a2c3-73b2a825cbd3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:24 np0005592767 nova_compute[182623]: 2026-01-22 22:30:24.411 182627 INFO nova.compute.manager [-] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:30:24 np0005592767 nova_compute[182623]: 2026-01-22 22:30:24.434 182627 DEBUG nova.compute.manager [None req-7ebea95d-44b4-40d2-9708-5d5542dfc9da - - - - - -] [instance: c98a3885-eda6-4fd8-a2c3-73b2a825cbd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:24 np0005592767 nova_compute[182623]: 2026-01-22 22:30:24.884 182627 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Successfully created port: 2e5d9578-19f2-4515-ab60-e0228580e897 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:30:25 np0005592767 nova_compute[182623]: 2026-01-22 22:30:25.804 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:26 np0005592767 nova_compute[182623]: 2026-01-22 22:30:26.055 182627 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Successfully updated port: 2e5d9578-19f2-4515-ab60-e0228580e897 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:30:26 np0005592767 nova_compute[182623]: 2026-01-22 22:30:26.073 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "refresh_cache-1d47834b-5f72-4020-a7cf-5071d682b0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:26 np0005592767 nova_compute[182623]: 2026-01-22 22:30:26.073 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquired lock "refresh_cache-1d47834b-5f72-4020-a7cf-5071d682b0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:26 np0005592767 nova_compute[182623]: 2026-01-22 22:30:26.074 182627 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:30:26 np0005592767 nova_compute[182623]: 2026-01-22 22:30:26.274 182627 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:30:27 np0005592767 podman[222638]: 2026-01-22 22:30:27.161053621 +0000 UTC m=+0.065247770 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:30:27 np0005592767 nova_compute[182623]: 2026-01-22 22:30:27.446 182627 DEBUG nova.compute.manager [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-changed-2e5d9578-19f2-4515-ab60-e0228580e897 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:27 np0005592767 nova_compute[182623]: 2026-01-22 22:30:27.446 182627 DEBUG nova.compute.manager [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Refreshing instance network info cache due to event network-changed-2e5d9578-19f2-4515-ab60-e0228580e897. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:30:27 np0005592767 nova_compute[182623]: 2026-01-22 22:30:27.446 182627 DEBUG oslo_concurrency.lockutils [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1d47834b-5f72-4020-a7cf-5071d682b0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.120 182627 DEBUG nova.network.neutron [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Updating instance_info_cache with network_info: [{"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.158 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Releasing lock "refresh_cache-1d47834b-5f72-4020-a7cf-5071d682b0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.158 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Instance network_info: |[{"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.159 182627 DEBUG oslo_concurrency.lockutils [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1d47834b-5f72-4020-a7cf-5071d682b0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.159 182627 DEBUG nova.network.neutron [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Refreshing network info cache for port 2e5d9578-19f2-4515-ab60-e0228580e897 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.162 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Start _get_guest_xml network_info=[{"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.167 182627 WARNING nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.173 182627 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.174 182627 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.178 182627 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.178 182627 DEBUG nova.virt.libvirt.host [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
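The two "Searching host ... for CPU controller" probes above check cgroups v1 and then v2; on this host the controller is only found under v2, where enabled controllers are listed in `/sys/fs/cgroup/cgroup.controllers`. A simplified sketch of the v2 check (not nova's actual code; the function takes the file contents as a string so it can be exercised without a real host):

```python
def has_cpu_controller(controllers_file_text: str) -> bool:
    """cgroups v2 probe: is 'cpu' among the space-separated controller
    names read from /sys/fs/cgroup/cgroup.controllers?"""
    return "cpu" in controllers_file_text.split()
```

Splitting on whitespace matters: a substring test would wrongly match `cpuset` on hosts where the `cpu` controller itself is absent.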
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.179 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.180 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.180 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.180 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.181 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.181 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.181 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.181 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.182 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.182 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.182 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.182 182627 DEBUG nova.virt.hardware [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
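The topology lines above ("Build topologies for 1 vcpu(s) 1:1:1", "Got 1 possible topologies") come from nova enumerating sockets x cores x threads factorizations of the vCPU count under the flavor/image limits. A simplified sketch of that enumeration (illustrative only, not nova.virt.hardware itself; the 65536 defaults mirror the limits printed in the log):

```python
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product is
    exactly *vcpus*, each factor capped by its limit."""
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append(VirtCPUTopology(s, c, t))
    return topos
```

For the 1-vCPU m1.nano flavor this yields only `VirtCPUTopology(sockets=1, cores=1, threads=1)`, matching the single topology the log reports.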
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.187 182627 DEBUG nova.virt.libvirt.vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-227483673',display_name='tempest-ListServersNegativeTestJSON-server-227483673-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-227483673-2',id=83,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0604aab7ee464a1ca74c3ef627dcc854',ramdisk_id='',reservation_id='r-1h9pdnzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1929749532',owner_user_name='tempest-ListServersNegativeTestJSON-1929749532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:21Z,user_data=None,user_id='24157ae704064825a4f59adf1d187391',uuid=1d47834b-5f72-4020-a7cf-5071d682b0d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.187 182627 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converting VIF {"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.188 182627 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
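The "Converting VIF ... Converted object VIFOpenVSwitch(...)" pair above shows os_vif_util flattening nova's nested port dict into the handful of fields os-vif needs. A trimmed sketch of that mapping (`SimpleVIF` and `nova_to_simple_vif` are hypothetical stand-ins, not the real `VIFOpenVSwitch` class or converter):

```python
from dataclasses import dataclass

@dataclass
class SimpleVIF:
    # Trimmed stand-in for os_vif's VIFOpenVSwitch (illustrative only)
    id: str
    address: str
    bridge_name: str
    vif_name: str
    active: bool

def nova_to_simple_vif(vif: dict) -> SimpleVIF:
    """Flatten the nova network_info port dict (as dumped in the log)
    into the fields the converted object carries."""
    return SimpleVIF(
        id=vif["id"],
        address=vif["address"],
        bridge_name=vif["details"]["bridge_name"],
        vif_name=vif["devname"],
        active=vif["active"],
    )
```

Note how the converted object's `bridge_name='br-int'` and `vif_name='tap2e5d9578-19'` come from `details.bridge_name` and `devname` in the source dict, and `active=False` carries through because the port is not yet bound/up.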
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.189 182627 DEBUG nova.objects.instance [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d47834b-5f72-4020-a7cf-5071d682b0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.206 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <uuid>1d47834b-5f72-4020-a7cf-5071d682b0d3</uuid>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <name>instance-00000053</name>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:name>tempest-ListServersNegativeTestJSON-server-227483673-2</nova:name>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:30:28</nova:creationTime>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:user uuid="24157ae704064825a4f59adf1d187391">tempest-ListServersNegativeTestJSON-1929749532-project-member</nova:user>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:project uuid="0604aab7ee464a1ca74c3ef627dcc854">tempest-ListServersNegativeTestJSON-1929749532</nova:project>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        <nova:port uuid="2e5d9578-19f2-4515-ab60-e0228580e897">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <entry name="serial">1d47834b-5f72-4020-a7cf-5071d682b0d3</entry>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <entry name="uuid">1d47834b-5f72-4020-a7cf-5071d682b0d3</entry>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.config"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:d5:c9:9e"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <target dev="tap2e5d9578-19"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/console.log" append="off"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:30:28 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:30:28 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:30:28 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:30:28 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
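The domain XML that `_get_guest_xml` just emitted can be summarized programmatically; a small sketch using the stdlib XML parser to pull out the fields operators usually grep for (`summarize_domain` is a hypothetical helper, not part of nova):

```python
import xml.etree.ElementTree as ET

def summarize_domain(xml_text: str) -> dict:
    """Extract the headline facts from a libvirt domain XML like the
    one nova logged: uuid, name, memory (KiB), vCPUs, machine type."""
    root = ET.fromstring(xml_text)
    return {
        "uuid": root.findtext("uuid"),
        "name": root.findtext("name"),
        "memory_kib": int(root.findtext("memory")),
        "vcpus": int(root.findtext("vcpu")),
        "machine": root.find("os/type").get("machine"),
    }
```

Against the XML above this would report the q35 machine type and the 131072 KiB (128 MiB) of memory matching the m1.nano flavor. Note that libvirt's bare `<memory>` element is in KiB by default.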
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.207 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Preparing to wait for external event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.208 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.208 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.208 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.209 182627 DEBUG nova.virt.libvirt.vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-227483673',display_name='tempest-ListServersNegativeTestJSON-server-227483673-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-227483673-2',id=83,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0604aab7ee464a1ca74c3ef627dcc854',ramdisk_id='',reservation_id='r-1h9pdnzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1929749532',owner_user_name='tempest-ListServersNegativeTestJSON-1929749532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:30:21Z,user_data=None,user_id='24157ae704064825a4f59adf1d187391',uuid=1d47834b-5f72-4020-a7cf-5071d682b0d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.210 182627 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converting VIF {"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.210 182627 DEBUG nova.network.os_vif_util [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.211 182627 DEBUG os_vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.211 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.212 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.212 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.219 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.219 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e5d9578-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.220 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e5d9578-19, col_values=(('external_ids', {'iface-id': '2e5d9578-19f2-4515-ab60-e0228580e897', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:c9:9e', 'vm-uuid': '1d47834b-5f72-4020-a7cf-5071d682b0d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.222 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:28 np0005592767 NetworkManager[54973]: <info>  [1769121028.2234] manager: (tap2e5d9578-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.224 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.229 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.230 182627 INFO os_vif [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19')#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.297 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.298 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.298 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] No VIF found with MAC fa:16:3e:d5:c9:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:30:28 np0005592767 nova_compute[182623]: 2026-01-22 22:30:28.299 182627 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Using config drive#033[00m
Jan 22 17:30:29 np0005592767 nova_compute[182623]: 2026-01-22 22:30:29.747 182627 INFO nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Creating config drive at /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.config#033[00m
Jan 22 17:30:29 np0005592767 nova_compute[182623]: 2026-01-22 22:30:29.754 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4czhx7r9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:29 np0005592767 nova_compute[182623]: 2026-01-22 22:30:29.895 182627 DEBUG oslo_concurrency.processutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4czhx7r9" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:29 np0005592767 NetworkManager[54973]: <info>  [1769121029.9759] manager: (tap2e5d9578-19): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Jan 22 17:30:29 np0005592767 kernel: tap2e5d9578-19: entered promiscuous mode
Jan 22 17:30:29 np0005592767 nova_compute[182623]: 2026-01-22 22:30:29.979 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:29Z|00304|binding|INFO|Claiming lport 2e5d9578-19f2-4515-ab60-e0228580e897 for this chassis.
Jan 22 17:30:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:29Z|00305|binding|INFO|2e5d9578-19f2-4515-ab60-e0228580e897: Claiming fa:16:3e:d5:c9:9e 10.100.0.12
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:29.993 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:c9:9e 10.100.0.12'], port_security=['fa:16:3e:d5:c9:9e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1d47834b-5f72-4020-a7cf-5071d682b0d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b03cd250-02c3-425c-a1d4-c454aa74a746', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12be1ce8-24f3-4356-bc90-b009c3a4fd93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=462ad325-898a-496d-9f84-227dfb38da3d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2e5d9578-19f2-4515-ab60-e0228580e897) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.002 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2e5d9578-19f2-4515-ab60-e0228580e897 in datapath b03cd250-02c3-425c-a1d4-c454aa74a746 bound to our chassis#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.006 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b03cd250-02c3-425c-a1d4-c454aa74a746#033[00m
Jan 22 17:30:30 np0005592767 systemd-udevd[222683]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.021 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5825c6fc-7403-408d-9935-d2e78bde0114]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.023 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb03cd250-01 in ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.025 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb03cd250-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.025 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a48c68d6-f114-4470-af84-e3adcd702985]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 systemd-machined[153912]: New machine qemu-40-instance-00000053.
Jan 22 17:30:30 np0005592767 NetworkManager[54973]: <info>  [1769121030.0367] device (tap2e5d9578-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:30:30 np0005592767 NetworkManager[54973]: <info>  [1769121030.0374] device (tap2e5d9578-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.038 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f08c34-b28f-4532-9fed-e3586f1530f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.057 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[327ecb29-656e-44f8-b813-e8200d69dcc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 systemd[1]: Started Virtual Machine qemu-40-instance-00000053.
Jan 22 17:30:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:30Z|00306|binding|INFO|Setting lport 2e5d9578-19f2-4515-ab60-e0228580e897 ovn-installed in OVS
Jan 22 17:30:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:30Z|00307|binding|INFO|Setting lport 2e5d9578-19f2-4515-ab60-e0228580e897 up in Southbound
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.084 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2118a7-495c-4a5d-84ab-e7b81bc9f45a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.144 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b8451ff3-15a2-4d8d-ab9a-921eca524065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.153 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9e6c45-d48a-4787-b993-1a9577101290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 NetworkManager[54973]: <info>  [1769121030.1588] manager: (tapb03cd250-00): new Veth device (/org/freedesktop/NetworkManager/Devices/149)
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.203 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd40326-4231-4442-8f1f-75be1291332f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.207 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a90e3410-43cf-4a1a-b557-25b438d39c9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 NetworkManager[54973]: <info>  [1769121030.2445] device (tapb03cd250-00): carrier: link connected
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.253 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e08eb9-9682-4ff8-abdd-21215c84cb75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.279 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[11049c5d-25c5-4a91-9018-88ef6de261a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb03cd250-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:67:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462682, 'reachable_time': 20259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222723, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.303 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[92cbeb9e-8884-49ac-b70b-5b20293822ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:6705'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 462682, 'tstamp': 462682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222724, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.330 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[df36c4d3-1200-40c3-a4ee-e928d6f7c9c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb03cd250-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:67:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462682, 'reachable_time': 20259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222726, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.346 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121030.346016, 1d47834b-5f72-4020-a7cf-5071d682b0d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.347 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] VM Started (Lifecycle Event)#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.376 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.379 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[147aa7e0-ec0d-4f82-ae4a-fd0016bfa1de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.383 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121030.3476071, 1d47834b-5f72-4020-a7cf-5071d682b0d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.384 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.410 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.413 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.431 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.468 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[afc214b7-6f71-4d99-bac8-80b06e88174c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.470 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03cd250-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.470 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.470 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb03cd250-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.472 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 kernel: tapb03cd250-00: entered promiscuous mode
Jan 22 17:30:30 np0005592767 NetworkManager[54973]: <info>  [1769121030.4749] manager: (tapb03cd250-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.474 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.475 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb03cd250-00, col_values=(('external_ids', {'iface-id': 'a20b41a8-fffe-4d8c-83ca-cc00cb778065'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.476 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:30Z|00308|binding|INFO|Releasing lport a20b41a8-fffe-4d8c-83ca-cc00cb778065 from this chassis (sb_readonly=0)
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.499 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.500 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.500 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.502 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[21f961e7-8feb-42c1-b038-e079325c05cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.502 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/b03cd250-02c3-425c-a1d4-c454aa74a746.pid.haproxy
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID b03cd250-02c3-425c-a1d4-c454aa74a746
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:30:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:30.503 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'env', 'PROCESS_TAG=haproxy-b03cd250-02c3-425c-a1d4-c454aa74a746', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b03cd250-02c3-425c-a1d4-c454aa74a746.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.611 182627 DEBUG nova.network.neutron [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Updated VIF entry in instance network info cache for port 2e5d9578-19f2-4515-ab60-e0228580e897. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.612 182627 DEBUG nova.network.neutron [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Updating instance_info_cache with network_info: [{"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.631 182627 DEBUG nova.compute.manager [req-2d297746-ae79-4f9b-ba49-7d8fe0ae5038 req-e927bc09-6773-419f-b36f-b1afa274277d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.632 182627 DEBUG oslo_concurrency.lockutils [req-2d297746-ae79-4f9b-ba49-7d8fe0ae5038 req-e927bc09-6773-419f-b36f-b1afa274277d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.632 182627 DEBUG oslo_concurrency.lockutils [req-2d297746-ae79-4f9b-ba49-7d8fe0ae5038 req-e927bc09-6773-419f-b36f-b1afa274277d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.632 182627 DEBUG oslo_concurrency.lockutils [req-2d297746-ae79-4f9b-ba49-7d8fe0ae5038 req-e927bc09-6773-419f-b36f-b1afa274277d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.633 182627 DEBUG nova.compute.manager [req-2d297746-ae79-4f9b-ba49-7d8fe0ae5038 req-e927bc09-6773-419f-b36f-b1afa274277d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Processing event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.633 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.637 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121030.6373317, 1d47834b-5f72-4020-a7cf-5071d682b0d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.637 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.639 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.645 182627 DEBUG oslo_concurrency.lockutils [req-87b84794-1245-4fee-aa1b-011b36648920 req-0efbe95d-71e1-47a3-a227-307c2352bafd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1d47834b-5f72-4020-a7cf-5071d682b0d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.647 182627 INFO nova.virt.libvirt.driver [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Instance spawned successfully.#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.648 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.664 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.670 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.722 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.723 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.724 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.726 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.727 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.728 182627 DEBUG nova.virt.libvirt.driver [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.735 182627 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.736 182627 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.736 182627 DEBUG nova.network.neutron [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.769 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.845 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.852 182627 INFO nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Took 9.19 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.853 182627 DEBUG nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.942 182627 INFO nova.compute.manager [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Took 9.92 seconds to build instance.#033[00m
Jan 22 17:30:30 np0005592767 nova_compute[182623]: 2026-01-22 22:30:30.988 182627 DEBUG oslo_concurrency.lockutils [None req-8119b7fd-3bf9-431c-a6ed-35936af7fdca 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:31 np0005592767 podman[222758]: 2026-01-22 22:30:31.000326113 +0000 UTC m=+0.068531813 container create 58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:31 np0005592767 systemd[1]: Started libpod-conmon-58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb.scope.
Jan 22 17:30:31 np0005592767 podman[222758]: 2026-01-22 22:30:30.967959526 +0000 UTC m=+0.036165236 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:30:31 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:30:31 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c48d2e24121b710048d338ade4453cd1859aa9355bc9e78ace9d1858bdb6349/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:30:31 np0005592767 podman[222758]: 2026-01-22 22:30:31.098602838 +0000 UTC m=+0.166808528 container init 58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:30:31 np0005592767 podman[222758]: 2026-01-22 22:30:31.112207313 +0000 UTC m=+0.180412993 container start 58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:30:31 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [NOTICE]   (222777) : New worker (222779) forked
Jan 22 17:30:31 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [NOTICE]   (222777) : Loading success.
Jan 22 17:30:32 np0005592767 nova_compute[182623]: 2026-01-22 22:30:32.766 182627 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:32 np0005592767 nova_compute[182623]: 2026-01-22 22:30:32.766 182627 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:32 np0005592767 nova_compute[182623]: 2026-01-22 22:30:32.767 182627 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:32 np0005592767 nova_compute[182623]: 2026-01-22 22:30:32.767 182627 DEBUG oslo_concurrency.lockutils [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:32 np0005592767 nova_compute[182623]: 2026-01-22 22:30:32.767 182627 DEBUG nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] No waiting events found dispatching network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:32 np0005592767 nova_compute[182623]: 2026-01-22 22:30:32.767 182627 WARNING nova.compute.manager [req-616da5a1-6888-4e93-8dd5-97c3c5387eef req-6959a4bc-0e0e-44ba-a9a3-c31da6e694d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received unexpected event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.195 182627 DEBUG nova.network.neutron [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.221 182627 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.223 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.346 182627 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.347 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Creating file /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/dbcaf62710cf45e2a56f60775ffd1859.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.347 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/dbcaf62710cf45e2a56f60775ffd1859.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:33Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:11:5a 10.100.0.13
Jan 22 17:30:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:33Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:11:5a 10.100.0.13
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.909 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/dbcaf62710cf45e2a56f60775ffd1859.tmp" returned: 1 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.910 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/dbcaf62710cf45e2a56f60775ffd1859.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.911 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Creating directory /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 22 17:30:33 np0005592767 nova_compute[182623]: 2026-01-22 22:30:33.911 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:34 np0005592767 nova_compute[182623]: 2026-01-22 22:30:34.138 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:34 np0005592767 nova_compute[182623]: 2026-01-22 22:30:34.144 182627 DEBUG nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:30:35 np0005592767 nova_compute[182623]: 2026-01-22 22:30:35.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:36 np0005592767 kernel: tap50b7281e-d0 (unregistering): left promiscuous mode
Jan 22 17:30:36 np0005592767 NetworkManager[54973]: <info>  [1769121036.3216] device (tap50b7281e-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.320 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:36Z|00309|binding|INFO|Releasing lport 50b7281e-d0dc-4caf-a920-24203f11da00 from this chassis (sb_readonly=0)
Jan 22 17:30:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:36Z|00310|binding|INFO|Setting lport 50b7281e-d0dc-4caf-a920-24203f11da00 down in Southbound
Jan 22 17:30:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:36Z|00311|binding|INFO|Removing iface tap50b7281e-d0 ovn-installed in OVS
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.324 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.341 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:11:5a 10.100.0.13'], port_security=['fa:16:3e:74:11:5a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5a9390a0-5077-46b6-8f6c-b3b308db8b1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=50b7281e-d0dc-4caf-a920-24203f11da00) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.343 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 50b7281e-d0dc-4caf-a920-24203f11da00 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.345 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.346 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[46c79668-ac8f-41ba-be3f-2effc8f5191d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.348 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.353 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:36 np0005592767 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000051.scope: Deactivated successfully.
Jan 22 17:30:36 np0005592767 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000051.scope: Consumed 12.514s CPU time.
Jan 22 17:30:36 np0005592767 systemd-machined[153912]: Machine qemu-39-instance-00000051 terminated.
Jan 22 17:30:36 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [NOTICE]   (222597) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:36 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [NOTICE]   (222597) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:36 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [WARNING]  (222597) : Exiting Master process...
Jan 22 17:30:36 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [ALERT]    (222597) : Current worker (222602) exited with code 143 (Terminated)
Jan 22 17:30:36 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[222568]: [WARNING]  (222597) : All workers exited. Exiting... (0)
Jan 22 17:30:36 np0005592767 systemd[1]: libpod-ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80.scope: Deactivated successfully.
Jan 22 17:30:36 np0005592767 podman[222825]: 2026-01-22 22:30:36.49356185 +0000 UTC m=+0.047016193 container died ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:30:36 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:36 np0005592767 systemd[1]: var-lib-containers-storage-overlay-8a2075bc9fa9122caed72600041ec15218f34d6e083d4c3afebea41c7669d3d5-merged.mount: Deactivated successfully.
Jan 22 17:30:36 np0005592767 podman[222825]: 2026-01-22 22:30:36.530307311 +0000 UTC m=+0.083761644 container cleanup ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 17:30:36 np0005592767 systemd[1]: libpod-conmon-ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80.scope: Deactivated successfully.
Jan 22 17:30:36 np0005592767 podman[222856]: 2026-01-22 22:30:36.60436687 +0000 UTC m=+0.049882535 container remove ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.603 182627 DEBUG nova.compute.manager [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.605 182627 DEBUG oslo_concurrency.lockutils [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.605 182627 DEBUG oslo_concurrency.lockutils [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.605 182627 DEBUG oslo_concurrency.lockutils [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.605 182627 DEBUG nova.compute.manager [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.605 182627 WARNING nova.compute.manager [req-033e836b-b5d7-499b-a70a-e31d8a4232ee req-5fa47dbf-90a0-4d66-afc6-d317652e3db9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-unplugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.613 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[26d63e81-523f-4be7-a024-6805b0d5e23f]: (4, ('Thu Jan 22 10:30:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80)\nff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80\nThu Jan 22 10:30:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (ff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80)\nff43cd4dc8c1fd9bc7641ae4045e76db29aa661311c8722498041cded86f6a80\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.615 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[84dd1016-ba40-4282-8210-404d50ace437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.616 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.638 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:36 np0005592767 kernel: tap354683a7-30: left promiscuous mode
Jan 22 17:30:36 np0005592767 nova_compute[182623]: 2026-01-22 22:30:36.656 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.659 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5389a2ea-c030-4b53-b0d9-1031346a1f8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.677 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bb788fcc-a7ed-4821-95ad-ee43d39ecd57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.679 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8d053b-bc70-4b40-b8f0-32c2d54c814d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.693 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[024756c4-3a3e-4cc1-8bba-b0c6436dbafe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461668, 'reachable_time': 43068, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222889, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:36 np0005592767 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.697 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:30:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:36.698 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f1281496-1ff7-410b-95ae-b29c4c1c9c4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.162 182627 INFO nova.virt.libvirt.driver [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance shutdown successfully after 3 seconds.#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.172 182627 INFO nova.virt.libvirt.driver [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Instance destroyed successfully.#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.173 182627 DEBUG nova.virt.libvirt.vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',ima
ge_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:30Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.174 182627 DEBUG nova.network.os_vif_util [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:74:11:5a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.175 182627 DEBUG nova.network.os_vif_util [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.177 182627 DEBUG os_vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.184 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.184 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b7281e-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.194 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.197 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.200 182627 INFO os_vif [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0')#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.205 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.297 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.301 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.392 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.396 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Copying file /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk to 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:30:37 np0005592767 nova_compute[182623]: 2026-01-22 22:30:37.397 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.033 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "scp -r /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk" returned: 0 in 0.636s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.034 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Copying file /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.035 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk.config 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.248 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "scp -C -r /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk.config 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.config" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.250 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Copying file /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.250 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk.info 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.491 182627 DEBUG oslo_concurrency.processutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "scp -C -r /var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d_resize/disk.info 192.168.122.100:/var/lib/nova/instances/5a9390a0-5077-46b6-8f6c-b3b308db8b1d/disk.info" returned: 0 in 0.241s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.613 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.613 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.613 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.614 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.614 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.625 182627 INFO nova.compute.manager [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Terminating instance#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.642 182627 DEBUG nova.compute.manager [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:30:38 np0005592767 kernel: tap2e5d9578-19 (unregistering): left promiscuous mode
Jan 22 17:30:38 np0005592767 NetworkManager[54973]: <info>  [1769121038.6650] device (tap2e5d9578-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.720 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:38Z|00312|binding|INFO|Releasing lport 2e5d9578-19f2-4515-ab60-e0228580e897 from this chassis (sb_readonly=0)
Jan 22 17:30:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:38Z|00313|binding|INFO|Setting lport 2e5d9578-19f2-4515-ab60-e0228580e897 down in Southbound
Jan 22 17:30:38 np0005592767 ovn_controller[94769]: 2026-01-22T22:30:38Z|00314|binding|INFO|Removing iface tap2e5d9578-19 ovn-installed in OVS
Jan 22 17:30:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:38.735 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:c9:9e 10.100.0.12'], port_security=['fa:16:3e:d5:c9:9e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1d47834b-5f72-4020-a7cf-5071d682b0d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b03cd250-02c3-425c-a1d4-c454aa74a746', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0604aab7ee464a1ca74c3ef627dcc854', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12be1ce8-24f3-4356-bc90-b009c3a4fd93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=462ad325-898a-496d-9f84-227dfb38da3d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2e5d9578-19f2-4515-ab60-e0228580e897) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:38.736 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2e5d9578-19f2-4515-ab60-e0228580e897 in datapath b03cd250-02c3-425c-a1d4-c454aa74a746 unbound from our chassis#033[00m
Jan 22 17:30:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:38.737 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b03cd250-02c3-425c-a1d4-c454aa74a746, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:30:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:38.738 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c26606fe-2ba4-43f6-a49c-d90537955e7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:38.739 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 namespace which is not needed anymore#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.746 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.749 182627 DEBUG neutronclient.v2_0.client [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 50b7281e-d0dc-4caf-a920-24203f11da00 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.758 182627 DEBUG nova.compute.manager [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.759 182627 DEBUG oslo_concurrency.lockutils [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.760 182627 DEBUG oslo_concurrency.lockutils [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.760 182627 DEBUG oslo_concurrency.lockutils [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.761 182627 DEBUG nova.compute.manager [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.761 182627 WARNING nova.compute.manager [req-155e6e98-5eba-4126-b83f-2bd901409f43 req-123b4cb8-16bc-4717-8641-c40eedd6992b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 22 17:30:38 np0005592767 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 22 17:30:38 np0005592767 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000053.scope: Consumed 8.352s CPU time.
Jan 22 17:30:38 np0005592767 systemd-machined[153912]: Machine qemu-40-instance-00000053 terminated.
Jan 22 17:30:38 np0005592767 NetworkManager[54973]: <info>  [1769121038.8665] manager: (tap2e5d9578-19): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.892 182627 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.893 182627 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.893 182627 DEBUG oslo_concurrency.lockutils [None req-5adcee0b-9339-430c-b520-0fbe988a5740 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:38 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [NOTICE]   (222777) : haproxy version is 2.8.14-c23fe91
Jan 22 17:30:38 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [NOTICE]   (222777) : path to executable is /usr/sbin/haproxy
Jan 22 17:30:38 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [WARNING]  (222777) : Exiting Master process...
Jan 22 17:30:38 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [ALERT]    (222777) : Current worker (222779) exited with code 143 (Terminated)
Jan 22 17:30:38 np0005592767 neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746[222773]: [WARNING]  (222777) : All workers exited. Exiting... (0)
Jan 22 17:30:38 np0005592767 systemd[1]: libpod-58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb.scope: Deactivated successfully.
Jan 22 17:30:38 np0005592767 podman[222924]: 2026-01-22 22:30:38.916228604 +0000 UTC m=+0.054509676 container died 58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.916 182627 INFO nova.virt.libvirt.driver [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Instance destroyed successfully.#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.917 182627 DEBUG nova.objects.instance [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lazy-loading 'resources' on Instance uuid 1d47834b-5f72-4020-a7cf-5071d682b0d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb-userdata-shm.mount: Deactivated successfully.
Jan 22 17:30:38 np0005592767 systemd[1]: var-lib-containers-storage-overlay-8c48d2e24121b710048d338ade4453cd1859aa9355bc9e78ace9d1858bdb6349-merged.mount: Deactivated successfully.
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.950 182627 DEBUG nova.virt.libvirt.vif [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-227483673',display_name='tempest-ListServersNegativeTestJSON-server-227483673-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-227483673-2',id=83,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-22T22:30:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0604aab7ee464a1ca74c3ef627dcc854',ramdisk_id='',reservation_id='r-1h9pdnzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1929749532',owner_user_name='tempest-ListServersNegativeTestJSON-1929749532-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:30Z,user_data=None,user_id='24157ae704064825a4f59adf1d187391',uuid=1d47834b-5f72-4020-a7cf-5071d682b0d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.951 182627 DEBUG nova.network.os_vif_util [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converting VIF {"id": "2e5d9578-19f2-4515-ab60-e0228580e897", "address": "fa:16:3e:d5:c9:9e", "network": {"id": "b03cd250-02c3-425c-a1d4-c454aa74a746", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-844272303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0604aab7ee464a1ca74c3ef627dcc854", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e5d9578-19", "ovs_interfaceid": "2e5d9578-19f2-4515-ab60-e0228580e897", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:38 np0005592767 podman[222924]: 2026-01-22 22:30:38.951509223 +0000 UTC m=+0.089790285 container cleanup 58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.952 182627 DEBUG nova.network.os_vif_util [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.952 182627 DEBUG os_vif [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.954 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.955 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e5d9578-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.961 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.963 182627 INFO os_vif [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c9:9e,bridge_name='br-int',has_traffic_filtering=True,id=2e5d9578-19f2-4515-ab60-e0228580e897,network=Network(b03cd250-02c3-425c-a1d4-c454aa74a746),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e5d9578-19')#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.964 182627 INFO nova.virt.libvirt.driver [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Deleting instance files /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3_del#033[00m
Jan 22 17:30:38 np0005592767 nova_compute[182623]: 2026-01-22 22:30:38.966 182627 INFO nova.virt.libvirt.driver [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Deletion of /var/lib/nova/instances/1d47834b-5f72-4020-a7cf-5071d682b0d3_del complete#033[00m
Jan 22 17:30:38 np0005592767 systemd[1]: libpod-conmon-58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb.scope: Deactivated successfully.
Jan 22 17:30:39 np0005592767 podman[222968]: 2026-01-22 22:30:39.026371805 +0000 UTC m=+0.047637271 container remove 58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.032 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[755a0b03-a008-4cde-9cfe-88267e833f50]: (4, ('Thu Jan 22 10:30:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 (58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb)\n58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb\nThu Jan 22 10:30:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 (58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb)\n58cdb14b6016bf877e4308fb5eb69b5aa48ea93b82e69cc77b56ce3006bc3bfb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.034 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[10d45daa-ce85-4feb-9549-3919eda023f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.035 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03cd250-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:39 np0005592767 kernel: tapb03cd250-00: left promiscuous mode
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.042 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd50743-43c2-4899-a718-e34f8d770a1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.044 182627 INFO nova.compute.manager [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.045 182627 DEBUG oslo.service.loopingcall [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.045 182627 DEBUG nova.compute.manager [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.046 182627 DEBUG nova.network.neutron [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[039a1578-bd03-416c-99da-ee3a7fd48199]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.061 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61dc13b2-b058-4458-b1c5-22fd4e9d9cc1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.085 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ff72d21a-171e-4140-ac29-5de3fd2b50fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462671, 'reachable_time': 18191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222980, 'error': None, 'target': 'ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.087 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b03cd250-02c3-425c-a1d4-c454aa74a746 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:30:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:39.087 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d39b742d-8515-4f4e-80b4-5e3755451173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:30:39 np0005592767 systemd[1]: run-netns-ovnmeta\x2db03cd250\x2d02c3\x2d425c\x2da1d4\x2dc454aa74a746.mount: Deactivated successfully.
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.197 182627 DEBUG nova.compute.manager [req-5473eafe-1d08-4506-a611-16f528b7890f req-316ebda5-d83d-4c0f-a9b7-86b5946e69a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-vif-unplugged-2e5d9578-19f2-4515-ab60-e0228580e897 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.198 182627 DEBUG oslo_concurrency.lockutils [req-5473eafe-1d08-4506-a611-16f528b7890f req-316ebda5-d83d-4c0f-a9b7-86b5946e69a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.198 182627 DEBUG oslo_concurrency.lockutils [req-5473eafe-1d08-4506-a611-16f528b7890f req-316ebda5-d83d-4c0f-a9b7-86b5946e69a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.198 182627 DEBUG oslo_concurrency.lockutils [req-5473eafe-1d08-4506-a611-16f528b7890f req-316ebda5-d83d-4c0f-a9b7-86b5946e69a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.199 182627 DEBUG nova.compute.manager [req-5473eafe-1d08-4506-a611-16f528b7890f req-316ebda5-d83d-4c0f-a9b7-86b5946e69a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] No waiting events found dispatching network-vif-unplugged-2e5d9578-19f2-4515-ab60-e0228580e897 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.199 182627 DEBUG nova.compute.manager [req-5473eafe-1d08-4506-a611-16f528b7890f req-316ebda5-d83d-4c0f-a9b7-86b5946e69a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-vif-unplugged-2e5d9578-19f2-4515-ab60-e0228580e897 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.896 182627 DEBUG nova.network.neutron [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:39 np0005592767 nova_compute[182623]: 2026-01-22 22:30:39.927 182627 INFO nova.compute.manager [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Took 0.88 seconds to deallocate network for instance.#033[00m
Jan 22 17:30:40 np0005592767 podman[222981]: 2026-01-22 22:30:40.166778877 +0000 UTC m=+0.080666616 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.176 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.176 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.338 182627 DEBUG nova.compute.provider_tree [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.354 182627 DEBUG nova.scheduler.client.report [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.381 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.429 182627 INFO nova.scheduler.client.report [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Deleted allocations for instance 1d47834b-5f72-4020-a7cf-5071d682b0d3#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.525 182627 DEBUG oslo_concurrency.lockutils [None req-c5e4c9f6-6186-48e8-8e2e-1afcc248ff0c 24157ae704064825a4f59adf1d187391 0604aab7ee464a1ca74c3ef627dcc854 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:40 np0005592767 nova_compute[182623]: 2026-01-22 22:30:40.889 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.019 182627 DEBUG nova.compute.manager [req-dbb5cba1-e744-40eb-bd85-d8ff389df0f1 req-f87b17fe-c817-4632-b404-c950d829d1c8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-vif-deleted-2e5d9578-19f2-4515-ab60-e0228580e897 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.506 182627 DEBUG nova.compute.manager [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-changed-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.507 182627 DEBUG nova.compute.manager [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Refreshing instance network info cache due to event network-changed-50b7281e-d0dc-4caf-a920-24203f11da00. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.508 182627 DEBUG oslo_concurrency.lockutils [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.508 182627 DEBUG oslo_concurrency.lockutils [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.509 182627 DEBUG nova.network.neutron [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Refreshing network info cache for port 50b7281e-d0dc-4caf-a920-24203f11da00 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.512 182627 DEBUG nova.compute.manager [req-de70b480-2e00-47d2-9802-700dc7e1d3c9 req-fff4e2b1-9a33-4ce8-acec-9649c88b1282 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.513 182627 DEBUG oslo_concurrency.lockutils [req-de70b480-2e00-47d2-9802-700dc7e1d3c9 req-fff4e2b1-9a33-4ce8-acec-9649c88b1282 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.513 182627 DEBUG oslo_concurrency.lockutils [req-de70b480-2e00-47d2-9802-700dc7e1d3c9 req-fff4e2b1-9a33-4ce8-acec-9649c88b1282 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.514 182627 DEBUG oslo_concurrency.lockutils [req-de70b480-2e00-47d2-9802-700dc7e1d3c9 req-fff4e2b1-9a33-4ce8-acec-9649c88b1282 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1d47834b-5f72-4020-a7cf-5071d682b0d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.514 182627 DEBUG nova.compute.manager [req-de70b480-2e00-47d2-9802-700dc7e1d3c9 req-fff4e2b1-9a33-4ce8-acec-9649c88b1282 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] No waiting events found dispatching network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:41 np0005592767 nova_compute[182623]: 2026-01-22 22:30:41.515 182627 WARNING nova.compute.manager [req-de70b480-2e00-47d2-9802-700dc7e1d3c9 req-fff4e2b1-9a33-4ce8-acec-9649c88b1282 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Received unexpected event network-vif-plugged-2e5d9578-19f2-4515-ab60-e0228580e897 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.621 182627 DEBUG nova.compute.manager [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.622 182627 DEBUG oslo_concurrency.lockutils [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.622 182627 DEBUG oslo_concurrency.lockutils [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.622 182627 DEBUG oslo_concurrency.lockutils [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.622 182627 DEBUG nova.compute.manager [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.623 182627 WARNING nova.compute.manager [req-26e20ab9-ec23-4055-aeea-5b932c496fb7 req-c6d66243-bfbc-4f1c-9191-8610c370e31a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 22 17:30:43 np0005592767 nova_compute[182623]: 2026-01-22 22:30:43.958 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:44 np0005592767 nova_compute[182623]: 2026-01-22 22:30:44.389 182627 DEBUG nova.network.neutron [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updated VIF entry in instance network info cache for port 50b7281e-d0dc-4caf-a920-24203f11da00. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:30:44 np0005592767 nova_compute[182623]: 2026-01-22 22:30:44.390 182627 DEBUG nova.network.neutron [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:44 np0005592767 nova_compute[182623]: 2026-01-22 22:30:44.404 182627 DEBUG oslo_concurrency.lockutils [req-0f770c48-f5dc-42a0-98bb-372fa52d6706 req-3c53749b-dae5-4135-baa2-7c8532c5047b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.802 182627 DEBUG nova.compute.manager [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.802 182627 DEBUG oslo_concurrency.lockutils [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.803 182627 DEBUG oslo_concurrency.lockutils [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.803 182627 DEBUG oslo_concurrency.lockutils [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.804 182627 DEBUG nova.compute.manager [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] No waiting events found dispatching network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.804 182627 WARNING nova.compute.manager [req-c09877b8-8ecd-4736-ae61-d9f1c8f31b7e req-806a5b0d-2854-4592-a3c9-a63775996a2f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Received unexpected event network-vif-plugged-50b7281e-d0dc-4caf-a920-24203f11da00 for instance with vm_state resized and task_state None.#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.874 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.893 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.918 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.919 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.919 182627 DEBUG nova.compute.manager [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 22 17:30:45 np0005592767 nova_compute[182623]: 2026-01-22 22:30:45.974 182627 DEBUG nova.objects.instance [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'info_cache' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:46 np0005592767 podman[223004]: 2026-01-22 22:30:46.163213081 +0000 UTC m=+0.078607149 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:30:46 np0005592767 podman[223003]: 2026-01-22 22:30:46.196571876 +0000 UTC m=+0.115311969 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:30:46 np0005592767 nova_compute[182623]: 2026-01-22 22:30:46.711 182627 DEBUG neutronclient.v2_0.client [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 50b7281e-d0dc-4caf-a920-24203f11da00 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 22 17:30:46 np0005592767 nova_compute[182623]: 2026-01-22 22:30:46.712 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:30:46 np0005592767 nova_compute[182623]: 2026-01-22 22:30:46.713 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:30:46 np0005592767 nova_compute[182623]: 2026-01-22 22:30:46.713 182627 DEBUG nova.network.neutron [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.219 182627 DEBUG nova.network.neutron [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Updating instance_info_cache with network_info: [{"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.236 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-5a9390a0-5077-46b6-8f6c-b3b308db8b1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.237 182627 DEBUG nova.objects.instance [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'migration_context' on Instance uuid 5a9390a0-5077-46b6-8f6c-b3b308db8b1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.276 182627 DEBUG nova.virt.libvirt.vif [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1146472499',display_name='tempest-ServerDiskConfigTestJSON-server-1146472499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1146472499',id=81,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:30:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-rnxdkv5c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:30:43Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=5a9390a0-5077-46b6-8f6c-b3b308db8b1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.277 182627 DEBUG nova.network.os_vif_util [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "50b7281e-d0dc-4caf-a920-24203f11da00", "address": "fa:16:3e:74:11:5a", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50b7281e-d0", "ovs_interfaceid": "50b7281e-d0dc-4caf-a920-24203f11da00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.278 182627 DEBUG nova.network.os_vif_util [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.278 182627 DEBUG os_vif [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.280 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b7281e-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.280 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.282 182627 INFO os_vif [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:11:5a,bridge_name='br-int',has_traffic_filtering=True,id=50b7281e-d0dc-4caf-a920-24203f11da00,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50b7281e-d0')#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.283 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.283 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.343 182627 DEBUG nova.compute.provider_tree [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.362 182627 DEBUG nova.scheduler.client.report [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.414 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.566 182627 INFO nova.scheduler.client.report [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Deleted allocation for migration 969cd541-4fe9-48ce-bf39-c0d737aa6531#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.662 182627 DEBUG oslo_concurrency.lockutils [None req-b51b72c6-45ac-4f99-87df-6dcd415fb761 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "5a9390a0-5077-46b6-8f6c-b3b308db8b1d" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:30:48 np0005592767 nova_compute[182623]: 2026-01-22 22:30:48.961 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:50 np0005592767 nova_compute[182623]: 2026-01-22 22:30:50.892 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:51 np0005592767 podman[223051]: 2026-01-22 22:30:51.14724558 +0000 UTC m=+0.056229245 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:30:51 np0005592767 podman[223050]: 2026-01-22 22:30:51.160199777 +0000 UTC m=+0.067235116 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:30:51 np0005592767 nova_compute[182623]: 2026-01-22 22:30:51.590 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121036.5883899, 5a9390a0-5077-46b6-8f6c-b3b308db8b1d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:51 np0005592767 nova_compute[182623]: 2026-01-22 22:30:51.590 182627 INFO nova.compute.manager [-] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:30:51 np0005592767 nova_compute[182623]: 2026-01-22 22:30:51.608 182627 DEBUG nova.compute.manager [None req-61424ff8-4547-4942-ab96-025c259fef82 - - - - - -] [instance: 5a9390a0-5077-46b6-8f6c-b3b308db8b1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:51.945 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:30:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:51.946 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:30:51 np0005592767 nova_compute[182623]: 2026-01-22 22:30:51.947 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:53 np0005592767 nova_compute[182623]: 2026-01-22 22:30:53.913 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121038.9119296, 1d47834b-5f72-4020-a7cf-5071d682b0d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:30:53 np0005592767 nova_compute[182623]: 2026-01-22 22:30:53.914 182627 INFO nova.compute.manager [-] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:30:53 np0005592767 nova_compute[182623]: 2026-01-22 22:30:53.958 182627 DEBUG nova.compute.manager [None req-dc559b63-2111-405b-bd77-f26b4a24a945 - - - - - -] [instance: 1d47834b-5f72-4020-a7cf-5071d682b0d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:30:54 np0005592767 nova_compute[182623]: 2026-01-22 22:30:54.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:55 np0005592767 nova_compute[182623]: 2026-01-22 22:30:55.895 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:30:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:30:56.949 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:30:58 np0005592767 podman[223092]: 2026-01-22 22:30:58.122584192 +0000 UTC m=+0.044631956 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:30:59 np0005592767 nova_compute[182623]: 2026-01-22 22:30:59.018 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:00 np0005592767 nova_compute[182623]: 2026-01-22 22:31:00.897 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:04 np0005592767 nova_compute[182623]: 2026-01-22 22:31:04.021 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:05 np0005592767 nova_compute[182623]: 2026-01-22 22:31:05.898 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.354 182627 DEBUG nova.compute.manager [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.480 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.481 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.509 182627 DEBUG nova.objects.instance [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.528 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.529 182627 INFO nova.compute.claims [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.530 182627 DEBUG nova.objects.instance [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.542 182627 DEBUG nova.objects.instance [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.585 182627 INFO nova.compute.resource_tracker [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updating resource usage from migration e2e35cdb-2b92-4aaa-a145-2b0519b9b28b#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.586 182627 DEBUG nova.compute.resource_tracker [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Starting to track incoming migration e2e35cdb-2b92-4aaa-a145-2b0519b9b28b with flavor 617fb2f8-2c15-4939-a64a-90fca4acd12a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.623 182627 DEBUG nova.scheduler.client.report [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.647 182627 DEBUG nova.scheduler.client.report [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.647 182627 DEBUG nova.compute.provider_tree [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.670 182627 DEBUG nova.scheduler.client.report [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.693 182627 DEBUG nova.scheduler.client.report [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.753 182627 DEBUG nova.compute.provider_tree [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.773 182627 DEBUG nova.scheduler.client.report [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.805 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:07 np0005592767 nova_compute[182623]: 2026-01-22 22:31:07.806 182627 INFO nova.compute.manager [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Migrating#033[00m
Jan 22 17:31:09 np0005592767 nova_compute[182623]: 2026-01-22 22:31:09.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:09 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:31:09 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:31:09 np0005592767 systemd-logind[802]: New session 52 of user nova.
Jan 22 17:31:09 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:31:09 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:31:10 np0005592767 systemd[223124]: Queued start job for default target Main User Target.
Jan 22 17:31:10 np0005592767 systemd[223124]: Created slice User Application Slice.
Jan 22 17:31:10 np0005592767 systemd[223124]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:31:10 np0005592767 systemd[223124]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:31:10 np0005592767 systemd[223124]: Reached target Paths.
Jan 22 17:31:10 np0005592767 systemd[223124]: Reached target Timers.
Jan 22 17:31:10 np0005592767 systemd[223124]: Starting D-Bus User Message Bus Socket...
Jan 22 17:31:10 np0005592767 systemd[223124]: Starting Create User's Volatile Files and Directories...
Jan 22 17:31:10 np0005592767 systemd[223124]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:31:10 np0005592767 systemd[223124]: Reached target Sockets.
Jan 22 17:31:10 np0005592767 systemd[223124]: Finished Create User's Volatile Files and Directories.
Jan 22 17:31:10 np0005592767 systemd[223124]: Reached target Basic System.
Jan 22 17:31:10 np0005592767 systemd[223124]: Reached target Main User Target.
Jan 22 17:31:10 np0005592767 systemd[223124]: Startup finished in 434ms.
Jan 22 17:31:10 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:31:10 np0005592767 systemd[1]: Started Session 52 of User nova.
Jan 22 17:31:10 np0005592767 podman[223139]: 2026-01-22 22:31:10.43205563 +0000 UTC m=+0.059151417 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 17:31:10 np0005592767 systemd[1]: session-52.scope: Deactivated successfully.
Jan 22 17:31:10 np0005592767 systemd-logind[802]: Session 52 logged out. Waiting for processes to exit.
Jan 22 17:31:10 np0005592767 systemd-logind[802]: Removed session 52.
Jan 22 17:31:10 np0005592767 systemd-logind[802]: New session 54 of user nova.
Jan 22 17:31:10 np0005592767 systemd[1]: Started Session 54 of User nova.
Jan 22 17:31:10 np0005592767 systemd[1]: session-54.scope: Deactivated successfully.
Jan 22 17:31:10 np0005592767 systemd-logind[802]: Session 54 logged out. Waiting for processes to exit.
Jan 22 17:31:10 np0005592767 systemd-logind[802]: Removed session 54.
Jan 22 17:31:10 np0005592767 nova_compute[182623]: 2026-01-22 22:31:10.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:12.102 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:12.103 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:12.103 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:14 np0005592767 nova_compute[182623]: 2026-01-22 22:31:14.033 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:14 np0005592767 nova_compute[182623]: 2026-01-22 22:31:14.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:15 np0005592767 nova_compute[182623]: 2026-01-22 22:31:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:15 np0005592767 nova_compute[182623]: 2026-01-22 22:31:15.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:16 np0005592767 nova_compute[182623]: 2026-01-22 22:31:16.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:17 np0005592767 podman[223167]: 2026-01-22 22:31:17.1571168 +0000 UTC m=+0.068767370 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:31:17 np0005592767 podman[223166]: 2026-01-22 22:31:17.218433255 +0000 UTC m=+0.124820586 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:31:17 np0005592767 nova_compute[182623]: 2026-01-22 22:31:17.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.038 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.925 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.925 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:19 np0005592767 nova_compute[182623]: 2026-01-22 22:31:19.925 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:31:20 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:31:20 np0005592767 systemd[223124]: Activating special unit Exit the Session...
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped target Main User Target.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped target Basic System.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped target Paths.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped target Sockets.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped target Timers.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:31:20 np0005592767 systemd[223124]: Closed D-Bus User Message Bus Socket.
Jan 22 17:31:20 np0005592767 systemd[223124]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:31:20 np0005592767 systemd[223124]: Removed slice User Application Slice.
Jan 22 17:31:20 np0005592767 systemd[223124]: Reached target Shutdown.
Jan 22 17:31:20 np0005592767 systemd[223124]: Finished Exit the Session.
Jan 22 17:31:20 np0005592767 systemd[223124]: Reached target Exit the Session.
Jan 22 17:31:20 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:31:20 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:31:20 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:31:20 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:31:20 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:31:20 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:31:20 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:31:20 np0005592767 nova_compute[182623]: 2026-01-22 22:31:20.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:20 np0005592767 nova_compute[182623]: 2026-01-22 22:31:20.928 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:20 np0005592767 nova_compute[182623]: 2026-01-22 22:31:20.929 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:20 np0005592767 nova_compute[182623]: 2026-01-22 22:31:20.929 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:20 np0005592767 nova_compute[182623]: 2026-01-22 22:31:20.929 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:31:20 np0005592767 nova_compute[182623]: 2026-01-22 22:31:20.958 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.071 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.072 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5708MB free_disk=73.2359619140625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.073 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.073 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.104 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration for instance 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.122 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updating resource usage from migration e2e35cdb-2b92-4aaa-a145-2b0519b9b28b
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.122 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Starting to track incoming migration e2e35cdb-2b92-4aaa-a145-2b0519b9b28b with flavor 617fb2f8-2c15-4939-a64a-90fca4acd12a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.158 182627 WARNING nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.158 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.159 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.190 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.202 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.223 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:31:21 np0005592767 nova_compute[182623]: 2026-01-22 22:31:21.223 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:22 np0005592767 podman[223213]: 2026-01-22 22:31:22.126465512 +0000 UTC m=+0.049088372 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:31:22 np0005592767 podman[223214]: 2026-01-22 22:31:22.154189668 +0000 UTC m=+0.072446964 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.225 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.668 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.668 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.688 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.702 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.702 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.723 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.800 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.801 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.822 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.822 182627 INFO nova.compute.claims [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:31:22 np0005592767 nova_compute[182623]: 2026-01-22 22:31:22.863 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.022 182627 DEBUG nova.compute.provider_tree [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.041 182627 DEBUG nova.scheduler.client.report [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.068 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.069 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.072 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.080 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.080 182627 INFO nova.compute.claims [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.162 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.163 182627 DEBUG nova.network.neutron [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.182 182627 INFO nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.221 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.291 182627 DEBUG nova.compute.manager [req-887d7d06-9739-432e-8845-b9f3feaf00a8 req-dbcf99fb-cc80-4462-9cd7-a162ffcd854c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-unplugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.291 182627 DEBUG oslo_concurrency.lockutils [req-887d7d06-9739-432e-8845-b9f3feaf00a8 req-dbcf99fb-cc80-4462-9cd7-a162ffcd854c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.292 182627 DEBUG oslo_concurrency.lockutils [req-887d7d06-9739-432e-8845-b9f3feaf00a8 req-dbcf99fb-cc80-4462-9cd7-a162ffcd854c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.293 182627 DEBUG oslo_concurrency.lockutils [req-887d7d06-9739-432e-8845-b9f3feaf00a8 req-dbcf99fb-cc80-4462-9cd7-a162ffcd854c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.293 182627 DEBUG nova.compute.manager [req-887d7d06-9739-432e-8845-b9f3feaf00a8 req-dbcf99fb-cc80-4462-9cd7-a162ffcd854c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] No waiting events found dispatching network-vif-unplugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.294 182627 WARNING nova.compute.manager [req-887d7d06-9739-432e-8845-b9f3feaf00a8 req-dbcf99fb-cc80-4462-9cd7-a162ffcd854c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received unexpected event network-vif-unplugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 for instance with vm_state active and task_state resize_migrating.
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.297 182627 DEBUG nova.compute.provider_tree [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.321 182627 DEBUG nova.scheduler.client.report [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.343 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.344 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.344 182627 INFO nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Creating image(s)
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.345 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.345 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.346 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.361 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.362 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.365 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.419 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.419 182627 DEBUG nova.network.neutron [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.426 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.427 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.427 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.440 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.460 182627 DEBUG nova.policy [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.464 182627 INFO nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.485 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.493 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.493 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.526 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.528 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.528 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.588 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.589 182627 DEBUG nova.virt.disk.api [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Checking if we can resize image /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.589 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.639 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.641 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.641 182627 INFO nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Creating image(s)#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.642 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "/var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.643 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "/var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.643 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "/var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.661 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.662 182627 DEBUG nova.virt.disk.api [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Cannot resize image /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.662 182627 DEBUG nova.objects.instance [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'migration_context' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.663 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.693 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.694 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Ensure instance console log exists: /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.694 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.695 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.695 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.725 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.726 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.726 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.742 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.803 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.804 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.836 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.838 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.838 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.912 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.913 182627 DEBUG nova.virt.disk.api [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Checking if we can resize image /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.913 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.988 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.989 182627 DEBUG nova.virt.disk.api [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Cannot resize image /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:31:23 np0005592767 nova_compute[182623]: 2026-01-22 22:31:23.990 182627 DEBUG nova.objects.instance [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'migration_context' on Instance uuid aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.003 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.003 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Ensure instance console log exists: /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.004 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.004 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.005 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:24 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:31:24 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:31:24 np0005592767 systemd-logind[802]: New session 55 of user nova.
Jan 22 17:31:24 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:31:24 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:31:24 np0005592767 systemd[223290]: Queued start job for default target Main User Target.
Jan 22 17:31:24 np0005592767 systemd[223290]: Created slice User Application Slice.
Jan 22 17:31:24 np0005592767 systemd[223290]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:31:24 np0005592767 systemd[223290]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:31:24 np0005592767 systemd[223290]: Reached target Paths.
Jan 22 17:31:24 np0005592767 systemd[223290]: Reached target Timers.
Jan 22 17:31:24 np0005592767 systemd[223290]: Starting D-Bus User Message Bus Socket...
Jan 22 17:31:24 np0005592767 systemd[223290]: Starting Create User's Volatile Files and Directories...
Jan 22 17:31:24 np0005592767 systemd[223290]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:31:24 np0005592767 systemd[223290]: Reached target Sockets.
Jan 22 17:31:24 np0005592767 systemd[223290]: Finished Create User's Volatile Files and Directories.
Jan 22 17:31:24 np0005592767 systemd[223290]: Reached target Basic System.
Jan 22 17:31:24 np0005592767 systemd[223290]: Reached target Main User Target.
Jan 22 17:31:24 np0005592767 systemd[223290]: Startup finished in 113ms.
Jan 22 17:31:24 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:31:24 np0005592767 systemd[1]: Started Session 55 of User nova.
Jan 22 17:31:24 np0005592767 nova_compute[182623]: 2026-01-22 22:31:24.592 182627 DEBUG nova.policy [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:31:24 np0005592767 systemd[1]: session-55.scope: Deactivated successfully.
Jan 22 17:31:24 np0005592767 systemd-logind[802]: Session 55 logged out. Waiting for processes to exit.
Jan 22 17:31:24 np0005592767 systemd-logind[802]: Removed session 55.
Jan 22 17:31:24 np0005592767 systemd-logind[802]: New session 57 of user nova.
Jan 22 17:31:24 np0005592767 systemd[1]: Started Session 57 of User nova.
Jan 22 17:31:24 np0005592767 systemd[1]: session-57.scope: Deactivated successfully.
Jan 22 17:31:24 np0005592767 systemd-logind[802]: Session 57 logged out. Waiting for processes to exit.
Jan 22 17:31:24 np0005592767 systemd-logind[802]: Removed session 57.
Jan 22 17:31:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:24Z|00315|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 22 17:31:25 np0005592767 systemd-logind[802]: New session 58 of user nova.
Jan 22 17:31:25 np0005592767 systemd[1]: Started Session 58 of User nova.
Jan 22 17:31:25 np0005592767 systemd[1]: session-58.scope: Deactivated successfully.
Jan 22 17:31:25 np0005592767 systemd-logind[802]: Session 58 logged out. Waiting for processes to exit.
Jan 22 17:31:25 np0005592767 systemd-logind[802]: Removed session 58.
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.268 182627 DEBUG nova.network.neutron [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Successfully created port: c2644a02-280b-410f-a2c6-37acfe5c15da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.399 182627 DEBUG nova.network.neutron [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Successfully created port: 4f4f1de0-a1fa-42ab-98de-698c12368baf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.408 182627 DEBUG nova.compute.manager [req-055b5e71-80b2-4883-bc5c-6fb2fdb9f6ac req-09bcdb72-8e57-4b73-bc02-92886d07262d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.409 182627 DEBUG oslo_concurrency.lockutils [req-055b5e71-80b2-4883-bc5c-6fb2fdb9f6ac req-09bcdb72-8e57-4b73-bc02-92886d07262d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.409 182627 DEBUG oslo_concurrency.lockutils [req-055b5e71-80b2-4883-bc5c-6fb2fdb9f6ac req-09bcdb72-8e57-4b73-bc02-92886d07262d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.410 182627 DEBUG oslo_concurrency.lockutils [req-055b5e71-80b2-4883-bc5c-6fb2fdb9f6ac req-09bcdb72-8e57-4b73-bc02-92886d07262d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.410 182627 DEBUG nova.compute.manager [req-055b5e71-80b2-4883-bc5c-6fb2fdb9f6ac req-09bcdb72-8e57-4b73-bc02-92886d07262d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] No waiting events found dispatching network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.410 182627 WARNING nova.compute.manager [req-055b5e71-80b2-4883-bc5c-6fb2fdb9f6ac req-09bcdb72-8e57-4b73-bc02-92886d07262d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received unexpected event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 for instance with vm_state active and task_state resize_migrating.
Jan 22 17:31:25 np0005592767 nova_compute[182623]: 2026-01-22 22:31:25.993 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:26 np0005592767 nova_compute[182623]: 2026-01-22 22:31:26.143 182627 INFO nova.network.neutron [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updating port 2b7f2db2-eef8-44dc-8de4-375eff38c764 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.154 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "refresh_cache-9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.155 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquired lock "refresh_cache-9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.155 182627 DEBUG nova.network.neutron [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.270 182627 DEBUG nova.network.neutron [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Successfully updated port: c2644a02-280b-410f-a2c6-37acfe5c15da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.300 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.300 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.300 182627 DEBUG nova.network.neutron [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.346 182627 DEBUG nova.compute.manager [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-changed-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.347 182627 DEBUG nova.compute.manager [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Refreshing instance network info cache due to event network-changed-2b7f2db2-eef8-44dc-8de4-375eff38c764. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.347 182627 DEBUG oslo_concurrency.lockutils [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.349 182627 DEBUG nova.network.neutron [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Successfully updated port: 4f4f1de0-a1fa-42ab-98de-698c12368baf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.366 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "refresh_cache-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.367 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquired lock "refresh_cache-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.367 182627 DEBUG nova.network.neutron [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.513 182627 DEBUG nova.network.neutron [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.579 182627 DEBUG nova.network.neutron [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:31:27 np0005592767 nova_compute[182623]: 2026-01-22 22:31:27.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.592 182627 DEBUG nova.network.neutron [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Updating instance_info_cache with network_info: [{"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.617 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Releasing lock "refresh_cache-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.617 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Instance network_info: |[{"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.619 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Start _get_guest_xml network_info=[{"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.623 182627 WARNING nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.629 182627 DEBUG nova.virt.libvirt.host [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.630 182627 DEBUG nova.virt.libvirt.host [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.635 182627 DEBUG nova.virt.libvirt.host [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.637 182627 DEBUG nova.virt.libvirt.host [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.638 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.638 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.639 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.639 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.639 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.639 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.640 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.640 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.640 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.640 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.640 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.641 182627 DEBUG nova.virt.hardware [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.644 182627 DEBUG nova.virt.libvirt.vif [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1126799150',display_name='tempest-ListServerFiltersTestJSON-instance-1126799150',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1126799150',id=90,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-45ry5zcn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempe
st-ListServerFiltersTestJSON-1169398826-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:23Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=aa33ef57-9092-4a0a-bf8f-fd0041ab60e7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.644 182627 DEBUG nova.network.os_vif_util [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.645 182627 DEBUG nova.network.os_vif_util [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.646 182627 DEBUG nova.objects.instance [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'pci_devices' on Instance uuid aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.669 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <uuid>aa33ef57-9092-4a0a-bf8f-fd0041ab60e7</uuid>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <name>instance-0000005a</name>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1126799150</nova:name>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:31:28</nova:creationTime>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:user uuid="b6f50d0e6a7444f0ac9c928363915afb">tempest-ListServerFiltersTestJSON-1169398826-project-member</nova:user>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:project uuid="802c49a328ca49e3a4ea4e46b9a9f5eb">tempest-ListServerFiltersTestJSON-1169398826</nova:project>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        <nova:port uuid="4f4f1de0-a1fa-42ab-98de-698c12368baf">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <entry name="serial">aa33ef57-9092-4a0a-bf8f-fd0041ab60e7</entry>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <entry name="uuid">aa33ef57-9092-4a0a-bf8f-fd0041ab60e7</entry>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.config"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:ee:99:5c"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <target dev="tap4f4f1de0-a1"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/console.log" append="off"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:31:28 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:31:28 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:31:28 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:31:28 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.670 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Preparing to wait for external event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.670 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.670 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.671 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.671 182627 DEBUG nova.virt.libvirt.vif [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1126799150',display_name='tempest-ListServerFiltersTestJSON-instance-1126799150',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1126799150',id=90,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-45ry5zcn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:23Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=aa33ef57-9092-4a0a-bf8f-fd0041ab60e7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.672 182627 DEBUG nova.network.os_vif_util [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.672 182627 DEBUG nova.network.os_vif_util [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.673 182627 DEBUG os_vif [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.673 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.673 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.674 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.676 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.677 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f4f1de0-a1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.677 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f4f1de0-a1, col_values=(('external_ids', {'iface-id': '4f4f1de0-a1fa-42ab-98de-698c12368baf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:99:5c', 'vm-uuid': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.679 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:28 np0005592767 NetworkManager[54973]: <info>  [1769121088.6799] manager: (tap4f4f1de0-a1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.681 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.685 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.686 182627 INFO os_vif [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1')#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.734 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.734 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.734 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] No VIF found with MAC fa:16:3e:ee:99:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.734 182627 INFO nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Using config drive#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.957 182627 DEBUG nova.compute.manager [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-changed-4f4f1de0-a1fa-42ab-98de-698c12368baf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.957 182627 DEBUG nova.compute.manager [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Refreshing instance network info cache due to event network-changed-4f4f1de0-a1fa-42ab-98de-698c12368baf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.957 182627 DEBUG oslo_concurrency.lockutils [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.958 182627 DEBUG oslo_concurrency.lockutils [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:31:28 np0005592767 nova_compute[182623]: 2026-01-22 22:31:28.958 182627 DEBUG nova.network.neutron [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Refreshing network info cache for port 4f4f1de0-a1fa-42ab-98de-698c12368baf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:31:29 np0005592767 podman[223319]: 2026-01-22 22:31:29.059643159 +0000 UTC m=+0.069942903 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.429 182627 DEBUG nova.compute.manager [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.430 182627 DEBUG nova.compute.manager [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing instance network info cache due to event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.431 182627 DEBUG oslo_concurrency.lockutils [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.662 182627 DEBUG nova.network.neutron [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.668 182627 DEBUG nova.network.neutron [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updating instance_info_cache with network_info: [{"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.688 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Releasing lock "refresh_cache-9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.694 182627 DEBUG oslo_concurrency.lockutils [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.695 182627 DEBUG nova.network.neutron [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Refreshing network info cache for port 2b7f2db2-eef8-44dc-8de4-375eff38c764 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.698 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.698 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance network_info: |[{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.699 182627 DEBUG oslo_concurrency.lockutils [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.700 182627 DEBUG nova.network.neutron [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.705 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Start _get_guest_xml network_info=[{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.717 182627 WARNING nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.723 182627 DEBUG nova.virt.libvirt.host [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.724 182627 DEBUG nova.virt.libvirt.host [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.728 182627 DEBUG nova.virt.libvirt.host [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.729 182627 DEBUG nova.virt.libvirt.host [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.730 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.731 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.732 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.732 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.733 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.733 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.734 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.734 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.734 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.735 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.735 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.736 182627 DEBUG nova.virt.hardware [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.742 182627 DEBUG nova.virt.libvirt.vif [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1902499833',display_name='tempest-ServerRescueTestJSONUnderV235-server-1902499833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1902499833',id=89,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a2c265c40d2b4195b882f2503b5ebd3c',ramdisk_id='',reservation_id='r-112y35u5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1110728559',owner_user_n
ame='tempest-ServerRescueTestJSONUnderV235-1110728559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:23Z,user_data=None,user_id='03f7ced9a7ee47849ffa16934d67478e',uuid=66bd6e4e-3db5-45d3-8495-bb100526e6a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.743 182627 DEBUG nova.network.os_vif_util [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converting VIF {"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.744 182627 DEBUG nova.network.os_vif_util [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.745 182627 DEBUG nova.objects.instance [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.760 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <uuid>66bd6e4e-3db5-45d3-8495-bb100526e6a2</uuid>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <name>instance-00000059</name>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1902499833</nova:name>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:31:29</nova:creationTime>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:user uuid="03f7ced9a7ee47849ffa16934d67478e">tempest-ServerRescueTestJSONUnderV235-1110728559-project-member</nova:user>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:project uuid="a2c265c40d2b4195b882f2503b5ebd3c">tempest-ServerRescueTestJSONUnderV235-1110728559</nova:project>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        <nova:port uuid="c2644a02-280b-410f-a2c6-37acfe5c15da">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <entry name="serial">66bd6e4e-3db5-45d3-8495-bb100526e6a2</entry>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <entry name="uuid">66bd6e4e-3db5-45d3-8495-bb100526e6a2</entry>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:d3:29:6b"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <target dev="tapc2644a02-28"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/console.log" append="off"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:31:29 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:31:29 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:31:29 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:31:29 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.762 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Preparing to wait for external event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.763 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.763 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.763 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.764 182627 DEBUG nova.virt.libvirt.vif [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1902499833',display_name='tempest-ServerRescueTestJSONUnderV235-server-1902499833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1902499833',id=89,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a2c265c40d2b4195b882f2503b5ebd3c',ramdisk_id='',reservation_id='r-112y35u5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1110728559',ow
ner_user_name='tempest-ServerRescueTestJSONUnderV235-1110728559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:23Z,user_data=None,user_id='03f7ced9a7ee47849ffa16934d67478e',uuid=66bd6e4e-3db5-45d3-8495-bb100526e6a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.764 182627 DEBUG nova.network.os_vif_util [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converting VIF {"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.765 182627 DEBUG nova.network.os_vif_util [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.765 182627 DEBUG os_vif [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.767 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.767 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.767 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.773 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.773 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2644a02-28, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.774 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc2644a02-28, col_values=(('external_ids', {'iface-id': 'c2644a02-280b-410f-a2c6-37acfe5c15da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:29:6b', 'vm-uuid': '66bd6e4e-3db5-45d3-8495-bb100526e6a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.775 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.776 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:31:29 np0005592767 NetworkManager[54973]: <info>  [1769121089.7769] manager: (tapc2644a02-28): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.790 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.791 182627 INFO os_vif [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28')#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.826 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.828 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.829 182627 INFO nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Creating image(s)#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.830 182627 DEBUG nova.objects.instance [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.849 182627 DEBUG oslo_concurrency.processutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.881 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.882 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.882 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No VIF found with MAC fa:16:3e:d3:29:6b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.883 182627 INFO nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Using config drive#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.927 182627 DEBUG oslo_concurrency.processutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.928 182627 DEBUG nova.virt.disk.api [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Checking if we can resize image /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:31:29 np0005592767 nova_compute[182623]: 2026-01-22 22:31:29.929 182627 DEBUG oslo_concurrency.processutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.010 182627 DEBUG oslo_concurrency.processutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.013 182627 DEBUG nova.virt.disk.api [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Cannot resize image /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.034 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.035 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Ensure instance console log exists: /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.036 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.036 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.036 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.041 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Start _get_guest_xml network_info=[{"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:b6:86:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.047 182627 WARNING nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.055 182627 DEBUG nova.virt.libvirt.host [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.056 182627 DEBUG nova.virt.libvirt.host [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.060 182627 DEBUG nova.virt.libvirt.host [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.061 182627 DEBUG nova.virt.libvirt.host [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.062 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.063 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.063 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.063 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.064 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.064 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.064 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.065 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.065 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.065 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.066 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.066 182627 DEBUG nova.virt.hardware [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.066 182627 DEBUG nova.objects.instance [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.082 182627 DEBUG oslo_concurrency.processutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.145 182627 DEBUG oslo_concurrency.processutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.147 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "/var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.148 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.149 182627 DEBUG oslo_concurrency.lockutils [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "/var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.150 182627 DEBUG nova.virt.libvirt.vif [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1655516205',display_name='tempest-ServerDiskConfigTestJSON-server-1655516205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1655516205',id=86,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-18l1vozy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio'
,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:25Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:b6:86:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.151 182627 DEBUG nova.network.os_vif_util [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:b6:86:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.152 182627 DEBUG nova.network.os_vif_util [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.155 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <uuid>9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2</uuid>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <name>instance-00000056</name>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <memory>196608</memory>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1655516205</nova:name>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:31:30</nova:creationTime>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.micro">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:memory>192</nova:memory>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:user uuid="b08cde28781a46649c6528e52d00b1c1">tempest-ServerDiskConfigTestJSON-973240997-project-member</nova:user>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:project uuid="708eb5a130224bd188eae5ec27c67df5">tempest-ServerDiskConfigTestJSON-973240997</nova:project>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        <nova:port uuid="2b7f2db2-eef8-44dc-8de4-375eff38c764">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <entry name="serial">9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2</entry>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <entry name="uuid">9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2</entry>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/disk.config"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b6:86:9e"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <target dev="tap2b7f2db2-ee"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2/console.log" append="off"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:31:30 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:31:30 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:31:30 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:31:30 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.157 182627 DEBUG nova.virt.libvirt.vif [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1655516205',display_name='tempest-ServerDiskConfigTestJSON-server-1655516205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1655516205',id=86,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-18l1vozy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio'
,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:25Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:b6:86:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.157 182627 DEBUG nova.network.os_vif_util [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "vif_mac": "fa:16:3e:b6:86:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.158 182627 DEBUG nova.network.os_vif_util [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.159 182627 DEBUG os_vif [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.160 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.160 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.161 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.163 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b7f2db2-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.164 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2b7f2db2-ee, col_values=(('external_ids', {'iface-id': '2b7f2db2-eef8-44dc-8de4-375eff38c764', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:86:9e', 'vm-uuid': '9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.1673] manager: (tap2b7f2db2-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.169 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.174 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.175 182627 INFO os_vif [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee')#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.214 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.215 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.215 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] No VIF found with MAC fa:16:3e:b6:86:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.216 182627 INFO nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Using config drive#033[00m
Jan 22 17:31:30 np0005592767 kernel: tap2b7f2db2-ee: entered promiscuous mode
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.2774] manager: (tap2b7f2db2-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.287 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00316|binding|INFO|Claiming lport 2b7f2db2-eef8-44dc-8de4-375eff38c764 for this chassis.
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00317|binding|INFO|2b7f2db2-eef8-44dc-8de4-375eff38c764: Claiming fa:16:3e:b6:86:9e 10.100.0.4
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.301 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:86:9e 10.100.0.4'], port_security=['fa:16:3e:b6:86:9e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2b7f2db2-eef8-44dc-8de4-375eff38c764) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.302 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2b7f2db2-eef8-44dc-8de4-375eff38c764 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 bound to our chassis#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.303 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 354683a7-3755-487f-b5f4-0a224cbf99c3#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.313 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a3366186-3556-47b8-9939-b4c6054c8cc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.314 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap354683a7-31 in ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:31:30 np0005592767 systemd-udevd[223376]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.317 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap354683a7-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.318 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[08f30dfc-de28-4a6b-88c0-5a50108f549f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.319 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a7bc5f48-390b-4439-b8a9-f2d3969fcee6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.328 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[18945f6d-3a93-49df-ba0b-b029da04c3c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 systemd-machined[153912]: New machine qemu-41-instance-00000056.
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.3408] device (tap2b7f2db2-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.3420] device (tap2b7f2db2-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:31:30 np0005592767 systemd[1]: Started Virtual Machine qemu-41-instance-00000056.
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.375 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5e4dab-1b35-4327-ba5b-e8fedeaafbd8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.391 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00318|binding|INFO|Setting lport 2b7f2db2-eef8-44dc-8de4-375eff38c764 ovn-installed in OVS
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00319|binding|INFO|Setting lport 2b7f2db2-eef8-44dc-8de4-375eff38c764 up in Southbound
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.393 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.414 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e89f45f0-91fa-4303-a724-7c6c9f1b2130]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.420 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b54b2e-c67d-4de9-ab3e-5157ece18951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.4223] manager: (tap354683a7-30): new Veth device (/org/freedesktop/NetworkManager/Devices/156)
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.466 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0b223fe4-b23f-405e-811e-44537ecd8d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.471 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[496c0d43-9b06-433f-aefe-2b0d5c95fcff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.4983] device (tap354683a7-30): carrier: link connected
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.506 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[789c9207-a435-4e5e-9270-6be3509cbc50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.524 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bab40abf-ae04-470e-a000-edecdb0a1102]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468707, 'reachable_time': 31176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223417, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.538 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[22d4dcc5-e2d2-4f11-a25e-7d44b73c6e08]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:a91e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468707, 'tstamp': 468707}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223418, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.548 182627 INFO nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Creating config drive at /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.config#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.555 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8xduzgu7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.557 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5c830c41-5183-4965-bc4b-b786006ed071]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap354683a7-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:a9:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468707, 'reachable_time': 31176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223419, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.597 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[03caa116-7f0e-4032-bd10-ee1faa934671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.672 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[73ce4cf2-117d-4803-8871-1e12b3590d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.674 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.674 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.675 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap354683a7-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.682 182627 DEBUG oslo_concurrency.processutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8xduzgu7" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.725 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 kernel: tap354683a7-30: entered promiscuous mode
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.7275] manager: (tap354683a7-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.7305] manager: (tap4f4f1de0-a1): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 22 17:31:30 np0005592767 systemd-udevd[223396]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:31:30 np0005592767 kernel: tap4f4f1de0-a1: entered promiscuous mode
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.734 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00320|binding|INFO|Claiming lport 4f4f1de0-a1fa-42ab-98de-698c12368baf for this chassis.
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00321|binding|INFO|4f4f1de0-a1fa-42ab-98de-698c12368baf: Claiming fa:16:3e:ee:99:5c 10.100.0.4
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.736 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap354683a7-30, col_values=(('external_ids', {'iface-id': 'c23cb3b6-ac49-408f-91d6-6f81f37b4f6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00322|binding|INFO|Releasing lport c23cb3b6-ac49-408f-91d6-6f81f37b4f6f from this chassis (sb_readonly=1)
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.737 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.743 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:99:5c 10.100.0.4'], port_security=['fa:16:3e:ee:99:5c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f234f62b-5371-4527-94e7-91cf5da3055e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3b57348-3994-471b-bd73-e78507392f5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c3e5cc-ee0d-48e7-8eab-3e968c7ed6fc, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4f4f1de0-a1fa-42ab-98de-698c12368baf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.7462] device (tap4f4f1de0-a1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:31:30 np0005592767 NetworkManager[54973]: <info>  [1769121090.7478] device (tap4f4f1de0-a1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.773 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.774 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.776 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[88dde098-a770-43d0-b67c-ea2207270496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.777 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/354683a7-3755-487f-b5f4-0a224cbf99c3.pid.haproxy
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 354683a7-3755-487f-b5f4-0a224cbf99c3
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:31:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:30.778 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'env', 'PROCESS_TAG=haproxy-354683a7-3755-487f-b5f4-0a224cbf99c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/354683a7-3755-487f-b5f4-0a224cbf99c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.779 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 systemd-machined[153912]: New machine qemu-42-instance-0000005a.
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00323|binding|INFO|Setting lport 4f4f1de0-a1fa-42ab-98de-698c12368baf ovn-installed in OVS
Jan 22 17:31:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:30Z|00324|binding|INFO|Setting lport 4f4f1de0-a1fa-42ab-98de-698c12368baf up in Southbound
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.790 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:30 np0005592767 systemd[1]: Started Virtual Machine qemu-42-instance-0000005a.
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.795 182627 DEBUG nova.compute.manager [req-e54fca06-0fb5-48d6-94c6-04833479aa94 req-2f03c91a-c955-4bdd-842e-b3aa92f3d03c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.795 182627 DEBUG oslo_concurrency.lockutils [req-e54fca06-0fb5-48d6-94c6-04833479aa94 req-2f03c91a-c955-4bdd-842e-b3aa92f3d03c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.795 182627 DEBUG oslo_concurrency.lockutils [req-e54fca06-0fb5-48d6-94c6-04833479aa94 req-2f03c91a-c955-4bdd-842e-b3aa92f3d03c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.796 182627 DEBUG oslo_concurrency.lockutils [req-e54fca06-0fb5-48d6-94c6-04833479aa94 req-2f03c91a-c955-4bdd-842e-b3aa92f3d03c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.796 182627 DEBUG nova.compute.manager [req-e54fca06-0fb5-48d6-94c6-04833479aa94 req-2f03c91a-c955-4bdd-842e-b3aa92f3d03c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] No waiting events found dispatching network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.796 182627 WARNING nova.compute.manager [req-e54fca06-0fb5-48d6-94c6-04833479aa94 req-2f03c91a-c955-4bdd-842e-b3aa92f3d03c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received unexpected event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.873 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121090.8731048, 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.874 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.876 182627 DEBUG nova.compute.manager [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.881 182627 INFO nova.virt.libvirt.driver [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Instance running successfully.#033[00m
Jan 22 17:31:30 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.884 182627 DEBUG nova.virt.libvirt.guest [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.885 182627 DEBUG nova.virt.libvirt.driver [None req-d4b33737-bc5a-404b-b74b-1eeed83c4f4a b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.903 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.915 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.965 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.966 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121090.873738, 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.967 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] VM Started (Lifecycle Event)#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.993 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:30 np0005592767 nova_compute[182623]: 2026-01-22 22:31:30.998 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.001 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:31:31 np0005592767 podman[223484]: 2026-01-22 22:31:31.223545721 +0000 UTC m=+0.068784330 container create 9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:31:31 np0005592767 systemd[1]: Started libpod-conmon-9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269.scope.
Jan 22 17:31:31 np0005592767 podman[223484]: 2026-01-22 22:31:31.18325635 +0000 UTC m=+0.028495049 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:31:31 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:31:31 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d1d6aed9403eb01baf0cd36a12e299f3a8542e761a5e22d86d47b71f6d21422/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:31:31 np0005592767 podman[223484]: 2026-01-22 22:31:31.328016122 +0000 UTC m=+0.173254781 container init 9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:31:31 np0005592767 podman[223484]: 2026-01-22 22:31:31.335346799 +0000 UTC m=+0.180585418 container start 9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:31:31 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [NOTICE]   (223511) : New worker (223514) forked
Jan 22 17:31:31 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [NOTICE]   (223511) : Loading success.
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.379 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121091.3790376, aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.380 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] VM Started (Lifecycle Event)#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.406 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.412 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121091.3793356, aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.412 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.419 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4f4f1de0-a1fa-42ab-98de-698c12368baf in datapath f234f62b-5371-4527-94e7-91cf5da3055e unbound from our chassis#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.424 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f234f62b-5371-4527-94e7-91cf5da3055e#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.436 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.437 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d999e0f9-b4b4-4753-aa94-bc99e42fcea2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.441 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.444 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf234f62b-51 in ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.446 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf234f62b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.447 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d5d07a-c080-482f-bbcb-27458adcf3b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.448 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f36fb9ce-27e1-4f36-9dbb-c689d13e10ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.462 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[fceeb8ae-3125-4d0e-b43b-ac7d13d94e02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.464 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.478 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[edca969c-dc35-46cb-866e-efb46b559411]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.506 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b40208-23b8-419a-b083-c8a7c67f59b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.517 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2dbe57ec-0212-4dde-82bd-27dc4289a98b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 NetworkManager[54973]: <info>  [1769121091.5189] manager: (tapf234f62b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.550 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a7137e6a-f00f-445a-9ffa-76982df330d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.554 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3b42b1-b8a0-470e-afa6-2af6053e7c73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 NetworkManager[54973]: <info>  [1769121091.5912] device (tapf234f62b-50): carrier: link connected
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.599 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e9e2a7-c812-496c-8791-fcad9714651d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.615 182627 INFO nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Creating config drive at /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.621 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpai400ox4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.628 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3a53b928-e622-4092-b1a5-f6588d61e11d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf234f62b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:3d:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468816, 'reachable_time': 25150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223539, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.654 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2c60aa0f-923c-49b9-8d06-613a477afcaa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:3df6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468816, 'tstamp': 468816}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223541, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.682 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b148cc-4d2e-48f6-8ec2-c9652af85b3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf234f62b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:3d:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468816, 'reachable_time': 25150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223544, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.719 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a45193c-6687-48e2-a7a4-a0b15012f8ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.748 182627 DEBUG oslo_concurrency.processutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpai400ox4" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.803 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0110a1aa-9227-43e0-a697-c02352690c39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.806 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf234f62b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.806 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.807 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf234f62b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:31 np0005592767 kernel: tapc2644a02-28: entered promiscuous mode
Jan 22 17:31:31 np0005592767 NetworkManager[54973]: <info>  [1769121091.8280] manager: (tapc2644a02-28): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Jan 22 17:31:31 np0005592767 NetworkManager[54973]: <info>  [1769121091.8559] device (tapc2644a02-28): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:31:31 np0005592767 NetworkManager[54973]: <info>  [1769121091.8565] device (tapc2644a02-28): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:31:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:31Z|00325|binding|INFO|Claiming lport c2644a02-280b-410f-a2c6-37acfe5c15da for this chassis.
Jan 22 17:31:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:31Z|00326|binding|INFO|c2644a02-280b-410f-a2c6-37acfe5c15da: Claiming fa:16:3e:d3:29:6b 10.100.0.14
Jan 22 17:31:31 np0005592767 NetworkManager[54973]: <info>  [1769121091.8683] manager: (tapf234f62b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.869 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.873 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:29:6b 10.100.0.14'], port_security=['fa:16:3e:d3:29:6b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6168b412-0d9d-447a-9f39-23f5915a9dfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '731874d8-38e6-4c19-a4b9-1132d12c448d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89edec45-9e1e-4610-916c-77f10f88a664, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=c2644a02-280b-410f-a2c6-37acfe5c15da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:31 np0005592767 systemd-machined[153912]: New machine qemu-43-instance-00000059.
Jan 22 17:31:31 np0005592767 kernel: tapf234f62b-50: entered promiscuous mode
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.915 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.919 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf234f62b-50, col_values=(('external_ids', {'iface-id': '0a1fd4a8-b506-4c9d-9846-1c0ab542e465'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.920 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:31Z|00327|binding|INFO|Setting lport c2644a02-280b-410f-a2c6-37acfe5c15da ovn-installed in OVS
Jan 22 17:31:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:31Z|00328|binding|INFO|Setting lport c2644a02-280b-410f-a2c6-37acfe5c15da up in Southbound
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:31Z|00329|binding|INFO|Releasing lport 0a1fd4a8-b506-4c9d-9846-1c0ab542e465 from this chassis (sb_readonly=1)
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.927 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.926 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:31:31 np0005592767 systemd[1]: Started Virtual Machine qemu-43-instance-00000059.
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.930 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7b59be-c9b2-49f0-b0a4-775fc1d1545e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.931 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/f234f62b-5371-4527-94e7-91cf5da3055e.pid.haproxy
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID f234f62b-5371-4527-94e7-91cf5da3055e
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:31:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:31.932 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'env', 'PROCESS_TAG=haproxy-f234f62b-5371-4527-94e7-91cf5da3055e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f234f62b-5371-4527-94e7-91cf5da3055e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:31:31 np0005592767 nova_compute[182623]: 2026-01-22 22:31:31.937 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:32 np0005592767 podman[223598]: 2026-01-22 22:31:32.358245071 +0000 UTC m=+0.049116092 container create 3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:31:32 np0005592767 systemd[1]: Started libpod-conmon-3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f.scope.
Jan 22 17:31:32 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:31:32 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9da1943a40f2d669f980b742c92b3cd56e4760d971e710d8fe2af8abf063e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:31:32 np0005592767 podman[223598]: 2026-01-22 22:31:32.334634852 +0000 UTC m=+0.025505883 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:31:32 np0005592767 podman[223598]: 2026-01-22 22:31:32.443918619 +0000 UTC m=+0.134789650 container init 3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:31:32 np0005592767 podman[223598]: 2026-01-22 22:31:32.449174928 +0000 UTC m=+0.140045949 container start 3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:31:32 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [NOTICE]   (223617) : New worker (223623) forked
Jan 22 17:31:32 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [NOTICE]   (223617) : Loading success.
Jan 22 17:31:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:32.508 104135 INFO neutron.agent.ovn.metadata.agent [-] Port c2644a02-280b-410f-a2c6-37acfe5c15da in datapath 6168b412-0d9d-447a-9f39-23f5915a9dfa unbound from our chassis
Jan 22 17:31:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:32.509 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6168b412-0d9d-447a-9f39-23f5915a9dfa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:31:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:32.510 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c524870-959b-4ee9-af87-a3a96f35842c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.568 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121092.5679977, 66bd6e4e-3db5-45d3-8495-bb100526e6a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.568 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] VM Started (Lifecycle Event)
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.618 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.626 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121092.5680933, 66bd6e4e-3db5-45d3-8495-bb100526e6a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.626 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] VM Paused (Lifecycle Event)
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.652 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.656 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:31:32 np0005592767 nova_compute[182623]: 2026-01-22 22:31:32.708 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.170 182627 DEBUG nova.network.neutron [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Updated VIF entry in instance network info cache for port 4f4f1de0-a1fa-42ab-98de-698c12368baf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.171 182627 DEBUG nova.network.neutron [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Updating instance_info_cache with network_info: [{"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.200 182627 DEBUG oslo_concurrency.lockutils [req-09f01266-067b-484d-bba5-44f04a8da855 req-d7571067-7611-4cc8-9fd9-4b73ddb54bf7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.412 182627 DEBUG nova.compute.manager [req-ff630ca6-fe5e-4afa-b291-41888d688fbe req-cf4b2cac-c5c7-49de-9061-cb69506fe68c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.413 182627 DEBUG oslo_concurrency.lockutils [req-ff630ca6-fe5e-4afa-b291-41888d688fbe req-cf4b2cac-c5c7-49de-9061-cb69506fe68c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.414 182627 DEBUG oslo_concurrency.lockutils [req-ff630ca6-fe5e-4afa-b291-41888d688fbe req-cf4b2cac-c5c7-49de-9061-cb69506fe68c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.414 182627 DEBUG oslo_concurrency.lockutils [req-ff630ca6-fe5e-4afa-b291-41888d688fbe req-cf4b2cac-c5c7-49de-9061-cb69506fe68c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.414 182627 DEBUG nova.compute.manager [req-ff630ca6-fe5e-4afa-b291-41888d688fbe req-cf4b2cac-c5c7-49de-9061-cb69506fe68c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Processing event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.415 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.419 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121093.4191651, aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.419 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] VM Resumed (Lifecycle Event)
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.421 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.426 182627 INFO nova.virt.libvirt.driver [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Instance spawned successfully.
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.427 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.462 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.462 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.463 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.463 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.464 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.464 182627 DEBUG nova.virt.libvirt.driver [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.468 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.471 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.509 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.596 182627 INFO nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Took 9.96 seconds to spawn the instance on the hypervisor.
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.596 182627 DEBUG nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.815 182627 INFO nova.compute.manager [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Took 11.03 seconds to build instance.
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.900 182627 DEBUG nova.compute.manager [req-bc87f659-00a3-45a3-a7a2-bc33dd48c73f req-77a23ea3-dc88-4593-bec5-6371c5bb388d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.900 182627 DEBUG oslo_concurrency.lockutils [req-bc87f659-00a3-45a3-a7a2-bc33dd48c73f req-77a23ea3-dc88-4593-bec5-6371c5bb388d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.901 182627 DEBUG oslo_concurrency.lockutils [req-bc87f659-00a3-45a3-a7a2-bc33dd48c73f req-77a23ea3-dc88-4593-bec5-6371c5bb388d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.901 182627 DEBUG oslo_concurrency.lockutils [req-bc87f659-00a3-45a3-a7a2-bc33dd48c73f req-77a23ea3-dc88-4593-bec5-6371c5bb388d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.901 182627 DEBUG nova.compute.manager [req-bc87f659-00a3-45a3-a7a2-bc33dd48c73f req-77a23ea3-dc88-4593-bec5-6371c5bb388d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] No waiting events found dispatching network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.901 182627 WARNING nova.compute.manager [req-bc87f659-00a3-45a3-a7a2-bc33dd48c73f req-77a23ea3-dc88-4593-bec5-6371c5bb388d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received unexpected event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 for instance with vm_state resized and task_state None.
Jan 22 17:31:33 np0005592767 nova_compute[182623]: 2026-01-22 22:31:33.904 182627 DEBUG oslo_concurrency.lockutils [None req-8eba9c73-e48f-45b9-925c-aabf7dd1cf95 b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:35 np0005592767 nova_compute[182623]: 2026-01-22 22:31:35.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:35 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:31:35 np0005592767 systemd[223290]: Activating special unit Exit the Session...
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped target Main User Target.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped target Basic System.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped target Paths.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped target Sockets.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped target Timers.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:31:35 np0005592767 systemd[223290]: Closed D-Bus User Message Bus Socket.
Jan 22 17:31:35 np0005592767 systemd[223290]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:31:35 np0005592767 systemd[223290]: Removed slice User Application Slice.
Jan 22 17:31:35 np0005592767 systemd[223290]: Reached target Shutdown.
Jan 22 17:31:35 np0005592767 systemd[223290]: Finished Exit the Session.
Jan 22 17:31:35 np0005592767 systemd[223290]: Reached target Exit the Session.
Jan 22 17:31:35 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:31:35 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:31:35 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:31:35 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:31:35 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:31:35 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:31:35 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.349 182627 DEBUG nova.network.neutron [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updated VIF entry in instance network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.350 182627 DEBUG nova.network.neutron [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.379 182627 DEBUG oslo_concurrency.lockutils [req-9a6f66d0-4736-4320-8514-ed8f556316d3 req-f4a3d4c3-7600-47f8-b2a5-16d7f4d2c293 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.460 182627 DEBUG nova.network.neutron [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updated VIF entry in instance network info cache for port 2b7f2db2-eef8-44dc-8de4-375eff38c764. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.461 182627 DEBUG nova.network.neutron [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updating instance_info_cache with network_info: [{"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.484 182627 DEBUG oslo_concurrency.lockutils [req-8c1cb65e-06ef-4eb5-bf1d-70772aaf8145 req-2a8572ff-1897-4fb8-8cf4-b213d95ce59f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.576 182627 DEBUG nova.compute.manager [req-7ea47aee-3cc0-4d39-8a83-a054087f2133 req-fad4a3f1-0f54-4a6f-b228-9192f90ede69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.576 182627 DEBUG oslo_concurrency.lockutils [req-7ea47aee-3cc0-4d39-8a83-a054087f2133 req-fad4a3f1-0f54-4a6f-b228-9192f90ede69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.576 182627 DEBUG oslo_concurrency.lockutils [req-7ea47aee-3cc0-4d39-8a83-a054087f2133 req-fad4a3f1-0f54-4a6f-b228-9192f90ede69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.577 182627 DEBUG oslo_concurrency.lockutils [req-7ea47aee-3cc0-4d39-8a83-a054087f2133 req-fad4a3f1-0f54-4a6f-b228-9192f90ede69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.577 182627 DEBUG nova.compute.manager [req-7ea47aee-3cc0-4d39-8a83-a054087f2133 req-fad4a3f1-0f54-4a6f-b228-9192f90ede69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] No waiting events found dispatching network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:36 np0005592767 nova_compute[182623]: 2026-01-22 22:31:36.577 182627 WARNING nova.compute.manager [req-7ea47aee-3cc0-4d39-8a83-a054087f2133 req-fad4a3f1-0f54-4a6f-b228-9192f90ede69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received unexpected event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf for instance with vm_state active and task_state None.
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.971 182627 DEBUG nova.compute.manager [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.972 182627 DEBUG oslo_concurrency.lockutils [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.972 182627 DEBUG oslo_concurrency.lockutils [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.972 182627 DEBUG oslo_concurrency.lockutils [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.973 182627 DEBUG nova.compute.manager [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Processing event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.973 182627 DEBUG nova.compute.manager [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.973 182627 DEBUG oslo_concurrency.lockutils [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.974 182627 DEBUG oslo_concurrency.lockutils [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.974 182627 DEBUG oslo_concurrency.lockutils [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.974 182627 DEBUG nova.compute.manager [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.975 182627 WARNING nova.compute.manager [req-e573a288-ce84-43ae-a2e0-fc883a80c686 req-dd8647ac-e2d0-4d3f-86fe-417309efe6a0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received unexpected event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with vm_state building and task_state spawning.
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.975 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.981 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121098.9809873, 66bd6e4e-3db5-45d3-8495-bb100526e6a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.981 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.983 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.986 182627 INFO nova.virt.libvirt.driver [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance spawned successfully.#033[00m
Jan 22 17:31:38 np0005592767 nova_compute[182623]: 2026-01-22 22:31:38.987 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.015 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.015 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.017 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.018 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.018 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.019 182627 DEBUG nova.virt.libvirt.driver [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.023 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.026 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.047 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.102 182627 INFO nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Took 15.76 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.102 182627 DEBUG nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.197 182627 INFO nova.compute.manager [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Took 16.42 seconds to build instance.#033[00m
Jan 22 17:31:39 np0005592767 nova_compute[182623]: 2026-01-22 22:31:39.227 182627 DEBUG oslo_concurrency.lockutils [None req-48e5b602-f641-4d74-a702-84214c1e9588 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:40 np0005592767 nova_compute[182623]: 2026-01-22 22:31:40.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:41 np0005592767 nova_compute[182623]: 2026-01-22 22:31:41.001 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:41 np0005592767 podman[223636]: 2026-01-22 22:31:41.151735228 +0000 UTC m=+0.073213556 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 17:31:41 np0005592767 nova_compute[182623]: 2026-01-22 22:31:41.231 182627 INFO nova.compute.manager [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Rescuing#033[00m
Jan 22 17:31:41 np0005592767 nova_compute[182623]: 2026-01-22 22:31:41.233 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:31:41 np0005592767 nova_compute[182623]: 2026-01-22 22:31:41.233 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:31:41 np0005592767 nova_compute[182623]: 2026-01-22 22:31:41.233 182627 DEBUG nova.network.neutron [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:31:42 np0005592767 nova_compute[182623]: 2026-01-22 22:31:42.982 182627 DEBUG nova.network.neutron [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:31:43 np0005592767 nova_compute[182623]: 2026-01-22 22:31:43.025 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:31:43 np0005592767 nova_compute[182623]: 2026-01-22 22:31:43.386 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:31:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:44Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b6:86:9e 10.100.0.4
Jan 22 17:31:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:44Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:86:9e 10.100.0.4
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.873 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.874 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.875 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.876 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.876 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.891 182627 INFO nova.compute.manager [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Terminating instance#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.909 182627 DEBUG nova.compute.manager [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:31:44 np0005592767 kernel: tap2b7f2db2-ee (unregistering): left promiscuous mode
Jan 22 17:31:44 np0005592767 NetworkManager[54973]: <info>  [1769121104.9542] device (tap2b7f2db2-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.964 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:44Z|00330|binding|INFO|Releasing lport 2b7f2db2-eef8-44dc-8de4-375eff38c764 from this chassis (sb_readonly=0)
Jan 22 17:31:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:44Z|00331|binding|INFO|Setting lport 2b7f2db2-eef8-44dc-8de4-375eff38c764 down in Southbound
Jan 22 17:31:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:44Z|00332|binding|INFO|Removing iface tap2b7f2db2-ee ovn-installed in OVS
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.970 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:44 np0005592767 nova_compute[182623]: 2026-01-22 22:31:44.980 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:44.988 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:86:9e 10.100.0.4'], port_security=['fa:16:3e:b6:86:9e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-354683a7-3755-487f-b5f4-0a224cbf99c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '708eb5a130224bd188eae5ec27c67df5', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'be8d0109-9c88-4841-849c-b6fb2fa1422d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b07b63-d4ae-4176-b5c5-fc3af300441b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2b7f2db2-eef8-44dc-8de4-375eff38c764) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:44.990 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2b7f2db2-eef8-44dc-8de4-375eff38c764 in datapath 354683a7-3755-487f-b5f4-0a224cbf99c3 unbound from our chassis#033[00m
Jan 22 17:31:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:44.991 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 354683a7-3755-487f-b5f4-0a224cbf99c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:44.997 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f0584a1a-e739-452c-aba4-a3bb54778a7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.003 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 namespace which is not needed anymore#033[00m
Jan 22 17:31:45 np0005592767 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000056.scope: Deactivated successfully.
Jan 22 17:31:45 np0005592767 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000056.scope: Consumed 12.684s CPU time.
Jan 22 17:31:45 np0005592767 systemd-machined[153912]: Machine qemu-41-instance-00000056 terminated.
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.170 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [NOTICE]   (223511) : haproxy version is 2.8.14-c23fe91
Jan 22 17:31:45 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [NOTICE]   (223511) : path to executable is /usr/sbin/haproxy
Jan 22 17:31:45 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [WARNING]  (223511) : Exiting Master process...
Jan 22 17:31:45 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [ALERT]    (223511) : Current worker (223514) exited with code 143 (Terminated)
Jan 22 17:31:45 np0005592767 neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3[223505]: [WARNING]  (223511) : All workers exited. Exiting... (0)
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.207 182627 INFO nova.virt.libvirt.driver [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Instance destroyed successfully.#033[00m
Jan 22 17:31:45 np0005592767 systemd[1]: libpod-9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269.scope: Deactivated successfully.
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.208 182627 DEBUG nova.objects.instance [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lazy-loading 'resources' on Instance uuid 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:45 np0005592767 podman[223700]: 2026-01-22 22:31:45.214715019 +0000 UTC m=+0.085237596 container died 9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.223 182627 DEBUG nova.virt.libvirt.vif [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1655516205',display_name='tempest-ServerDiskConfigTestJSON-server-1655516205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1655516205',id=86,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='708eb5a130224bd188eae5ec27c67df5',ramdisk_id='',reservation_id='r-18l1vozy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-973240997',owner_user_name='tempest-ServerDiskConfigTestJSON-973240997-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:41Z,user_data=None,user_id='b08cde28781a46649c6528e52d00b1c1',uuid=9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.224 182627 DEBUG nova.network.os_vif_util [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converting VIF {"id": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "address": "fa:16:3e:b6:86:9e", "network": {"id": "354683a7-3755-487f-b5f4-0a224cbf99c3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1777175908-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "708eb5a130224bd188eae5ec27c67df5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7f2db2-ee", "ovs_interfaceid": "2b7f2db2-eef8-44dc-8de4-375eff38c764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.225 182627 DEBUG nova.network.os_vif_util [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.225 182627 DEBUG os_vif [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.227 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.228 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b7f2db2-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.229 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.231 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.234 182627 INFO os_vif [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=2b7f2db2-eef8-44dc-8de4-375eff38c764,network=Network(354683a7-3755-487f-b5f4-0a224cbf99c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7f2db2-ee')#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.235 182627 INFO nova.virt.libvirt.driver [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Deleting instance files /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2_del#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.242 182627 INFO nova.virt.libvirt.driver [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Deletion of /var/lib/nova/instances/9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2_del complete#033[00m
Jan 22 17:31:45 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269-userdata-shm.mount: Deactivated successfully.
Jan 22 17:31:45 np0005592767 systemd[1]: var-lib-containers-storage-overlay-8d1d6aed9403eb01baf0cd36a12e299f3a8542e761a5e22d86d47b71f6d21422-merged.mount: Deactivated successfully.
Jan 22 17:31:45 np0005592767 podman[223700]: 2026-01-22 22:31:45.262235656 +0000 UTC m=+0.132758223 container cleanup 9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:31:45 np0005592767 systemd[1]: libpod-conmon-9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269.scope: Deactivated successfully.
Jan 22 17:31:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:45Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:99:5c 10.100.0.4
Jan 22 17:31:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:45Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:99:5c 10.100.0.4
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.321 182627 INFO nova.compute.manager [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.322 182627 DEBUG oslo.service.loopingcall [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.322 182627 DEBUG nova.compute.manager [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.323 182627 DEBUG nova.network.neutron [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:31:45 np0005592767 podman[223745]: 2026-01-22 22:31:45.343957951 +0000 UTC m=+0.054947987 container remove 9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.350 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[de73b53d-c897-45b0-b5ef-fd18c33a5312]: (4, ('Thu Jan 22 10:31:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269)\n9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269\nThu Jan 22 10:31:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 (9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269)\n9e0016f05108ceeb928fbc9276a3fd4cf0ea61a4b7f6450a4283490f683f7269\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.352 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[58c8ac5d-9549-4a2f-a14b-49e8cbfae7a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.353 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap354683a7-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.356 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 kernel: tap354683a7-30: left promiscuous mode
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.367 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.369 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.371 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8e1ae6-03a9-4008-8f53-fde64a168659]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.387 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a635a25d-3e50-48ed-88be-c462acad19a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.390 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[19d83b62-52a0-45b6-89b8-4001b08fda80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.411 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d1da8e-3e36-4bc6-a472-20ec9285d324]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468698, 'reachable_time': 16350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223758, 'error': None, 'target': 'ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 systemd[1]: run-netns-ovnmeta\x2d354683a7\x2d3755\x2d487f\x2db5f4\x2d0a224cbf99c3.mount: Deactivated successfully.
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.415 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-354683a7-3755-487f-b5f4-0a224cbf99c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:31:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:45.416 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[048be0ad-ed78-4412-af7d-a5d19c8977eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.967 182627 DEBUG nova.compute.manager [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-unplugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.968 182627 DEBUG oslo_concurrency.lockutils [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.968 182627 DEBUG oslo_concurrency.lockutils [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.969 182627 DEBUG oslo_concurrency.lockutils [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.969 182627 DEBUG nova.compute.manager [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] No waiting events found dispatching network-vif-unplugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.969 182627 DEBUG nova.compute.manager [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-unplugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.970 182627 DEBUG nova.compute.manager [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.970 182627 DEBUG oslo_concurrency.lockutils [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.970 182627 DEBUG oslo_concurrency.lockutils [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.970 182627 DEBUG oslo_concurrency.lockutils [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.971 182627 DEBUG nova.compute.manager [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] No waiting events found dispatching network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:31:45 np0005592767 nova_compute[182623]: 2026-01-22 22:31:45.971 182627 WARNING nova.compute.manager [req-70d7b5dd-1d00-4905-aad2-bd3c37e42ca7 req-4e4c38c8-467c-4347-87ef-0d7d326820b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received unexpected event network-vif-plugged-2b7f2db2-eef8-44dc-8de4-375eff38c764 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.003 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.203 182627 DEBUG nova.network.neutron [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.224 182627 INFO nova.compute.manager [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Took 0.90 seconds to deallocate network for instance.#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.316 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.317 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.324 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.336 182627 DEBUG nova.compute.manager [req-5061ed68-b130-428a-bee7-aad6f068ea16 req-7384bc57-b3d3-4f04-82b4-8c7abf1f3bc2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Received event network-vif-deleted-2b7f2db2-eef8-44dc-8de4-375eff38c764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.373 182627 INFO nova.scheduler.client.report [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Deleted allocations for instance 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2#033[00m
Jan 22 17:31:46 np0005592767 nova_compute[182623]: 2026-01-22 22:31:46.451 182627 DEBUG oslo_concurrency.lockutils [None req-3c9a4611-a33c-4d0d-8569-67b8c7e87c71 b08cde28781a46649c6528e52d00b1c1 708eb5a130224bd188eae5ec27c67df5 - - default default] Lock "9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:48 np0005592767 podman[223760]: 2026-01-22 22:31:48.168047139 +0000 UTC m=+0.074321436 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 22 17:31:48 np0005592767 podman[223759]: 2026-01-22 22:31:48.179829433 +0000 UTC m=+0.094179589 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:31:50 np0005592767 nova_compute[182623]: 2026-01-22 22:31:50.230 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:51 np0005592767 nova_compute[182623]: 2026-01-22 22:31:51.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:53 np0005592767 podman[223824]: 2026-01-22 22:31:53.149146011 +0000 UTC m=+0.056602892 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:31:53 np0005592767 podman[223823]: 2026-01-22 22:31:53.161922083 +0000 UTC m=+0.064070874 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:31:53 np0005592767 nova_compute[182623]: 2026-01-22 22:31:53.435 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:31:55 np0005592767 nova_compute[182623]: 2026-01-22 22:31:55.233 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:55 np0005592767 kernel: tapc2644a02-28 (unregistering): left promiscuous mode
Jan 22 17:31:55 np0005592767 NetworkManager[54973]: <info>  [1769121115.6439] device (tapc2644a02-28): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:31:55 np0005592767 nova_compute[182623]: 2026-01-22 22:31:55.650 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:55Z|00333|binding|INFO|Releasing lport c2644a02-280b-410f-a2c6-37acfe5c15da from this chassis (sb_readonly=0)
Jan 22 17:31:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:55Z|00334|binding|INFO|Setting lport c2644a02-280b-410f-a2c6-37acfe5c15da down in Southbound
Jan 22 17:31:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:55Z|00335|binding|INFO|Removing iface tapc2644a02-28 ovn-installed in OVS
Jan 22 17:31:55 np0005592767 nova_compute[182623]: 2026-01-22 22:31:55.652 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:55.659 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:29:6b 10.100.0.14'], port_security=['fa:16:3e:d3:29:6b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6168b412-0d9d-447a-9f39-23f5915a9dfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '731874d8-38e6-4c19-a4b9-1132d12c448d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89edec45-9e1e-4610-916c-77f10f88a664, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=c2644a02-280b-410f-a2c6-37acfe5c15da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:55.660 104135 INFO neutron.agent.ovn.metadata.agent [-] Port c2644a02-280b-410f-a2c6-37acfe5c15da in datapath 6168b412-0d9d-447a-9f39-23f5915a9dfa unbound from our chassis#033[00m
Jan 22 17:31:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:55.661 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6168b412-0d9d-447a-9f39-23f5915a9dfa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:31:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:55.662 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[efa5b933-1419-48e1-baf2-113f3278dec5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:55 np0005592767 nova_compute[182623]: 2026-01-22 22:31:55.665 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:55 np0005592767 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000059.scope: Deactivated successfully.
Jan 22 17:31:55 np0005592767 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000059.scope: Consumed 12.930s CPU time.
Jan 22 17:31:55 np0005592767 systemd-machined[153912]: Machine qemu-43-instance-00000059 terminated.
Jan 22 17:31:55 np0005592767 nova_compute[182623]: 2026-01-22 22:31:55.925 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.053 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.448 182627 INFO nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.454 182627 INFO nova.virt.libvirt.driver [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance destroyed successfully.#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.455 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'numa_topology' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.472 182627 INFO nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Attempting rescue#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.473 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.478 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.478 182627 INFO nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Creating image(s)#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.479 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.479 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.480 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.481 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.511 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.512 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.525 182627 DEBUG oslo_concurrency.processutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.588 182627 DEBUG oslo_concurrency.processutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.589 182627 DEBUG oslo_concurrency.processutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.634 182627 DEBUG oslo_concurrency.processutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.rescue" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.635 182627 DEBUG oslo_concurrency.lockutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.636 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'migration_context' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.725 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.726 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Start _get_guest_xml network_info=[{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "vif_mac": "fa:16:3e:d3:29:6b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.726 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'resources' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.913 182627 WARNING nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.926 182627 DEBUG nova.virt.libvirt.host [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.927 182627 DEBUG nova.virt.libvirt.host [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.930 182627 DEBUG nova.virt.libvirt.host [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.931 182627 DEBUG nova.virt.libvirt.host [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.932 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.932 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.933 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.933 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.933 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.933 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.933 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.934 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.934 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.934 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.934 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.934 182627 DEBUG nova.virt.hardware [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.935 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.951 182627 DEBUG nova.virt.libvirt.vif [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1902499833',display_name='tempest-ServerRescueTestJSONUnderV235-server-1902499833',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1902499833',id=89,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a2c265c40d2b4195b882f2503b5ebd3c',ramdisk_id='',reservation_id='r-112y35u5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1110728559',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1110728559-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:31:39Z,user_data=None,user_id='03f7ced9a7ee47849ffa16934d67478e',uuid=66bd6e4e-3db5-45d3-8495-bb100526e6a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "vif_mac": "fa:16:3e:d3:29:6b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.952 182627 DEBUG nova.network.os_vif_util [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converting VIF {"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "vif_mac": "fa:16:3e:d3:29:6b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.952 182627 DEBUG nova.network.os_vif_util [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.953 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.967 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <uuid>66bd6e4e-3db5-45d3-8495-bb100526e6a2</uuid>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <name>instance-00000059</name>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1902499833</nova:name>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:31:56</nova:creationTime>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:user uuid="03f7ced9a7ee47849ffa16934d67478e">tempest-ServerRescueTestJSONUnderV235-1110728559-project-member</nova:user>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:project uuid="a2c265c40d2b4195b882f2503b5ebd3c">tempest-ServerRescueTestJSONUnderV235-1110728559</nova:project>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        <nova:port uuid="c2644a02-280b-410f-a2c6-37acfe5c15da">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <entry name="serial">66bd6e4e-3db5-45d3-8495-bb100526e6a2</entry>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <entry name="uuid">66bd6e4e-3db5-45d3-8495-bb100526e6a2</entry>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.rescue"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <target dev="vdb" bus="virtio"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config.rescue"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:d3:29:6b"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <target dev="tapc2644a02-28"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/console.log" append="off"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:31:56 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:31:56 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:31:56 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:31:56 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:31:56 np0005592767 nova_compute[182623]: 2026-01-22 22:31:56.975 182627 INFO nova.virt.libvirt.driver [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance destroyed successfully.#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.049 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.050 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.050 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.050 182627 DEBUG nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] No VIF found with MAC fa:16:3e:d3:29:6b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.051 182627 INFO nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Using config drive#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.068 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'ec2_ids' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.102 182627 DEBUG nova.objects.instance [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'keypairs' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.367 182627 DEBUG nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-unplugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.368 182627 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.368 182627 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.368 182627 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.369 182627 DEBUG nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-unplugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.369 182627 WARNING nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received unexpected event network-vif-unplugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.369 182627 DEBUG nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.369 182627 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.369 182627 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.370 182627 DEBUG oslo_concurrency.lockutils [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.370 182627 DEBUG nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.370 182627 WARNING nova.compute.manager [req-2d93e00b-ae38-4ce3-b6e5-8decd15ea599 req-01ded21a-6b7b-49fa-b345-44d6d4f04e97 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received unexpected event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.771 182627 INFO nova.virt.libvirt.driver [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Creating config drive at /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config.rescue#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.780 182627 DEBUG oslo_concurrency.processutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphi7s9011 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.911 182627 DEBUG oslo_concurrency.processutils [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphi7s9011" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:31:57 np0005592767 kernel: tapc2644a02-28: entered promiscuous mode
Jan 22 17:31:57 np0005592767 systemd-udevd[223868]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:31:57 np0005592767 NetworkManager[54973]: <info>  [1769121117.9751] manager: (tapc2644a02-28): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Jan 22 17:31:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:57Z|00336|binding|INFO|Claiming lport c2644a02-280b-410f-a2c6-37acfe5c15da for this chassis.
Jan 22 17:31:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:57Z|00337|binding|INFO|c2644a02-280b-410f-a2c6-37acfe5c15da: Claiming fa:16:3e:d3:29:6b 10.100.0.14
Jan 22 17:31:57 np0005592767 nova_compute[182623]: 2026-01-22 22:31:57.976 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:57 np0005592767 NetworkManager[54973]: <info>  [1769121117.9859] device (tapc2644a02-28): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:31:57 np0005592767 NetworkManager[54973]: <info>  [1769121117.9875] device (tapc2644a02-28): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:31:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:57.997 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:29:6b 10.100.0.14'], port_security=['fa:16:3e:d3:29:6b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6168b412-0d9d-447a-9f39-23f5915a9dfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '731874d8-38e6-4c19-a4b9-1132d12c448d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89edec45-9e1e-4610-916c-77f10f88a664, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=c2644a02-280b-410f-a2c6-37acfe5c15da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:57.999 104135 INFO neutron.agent.ovn.metadata.agent [-] Port c2644a02-280b-410f-a2c6-37acfe5c15da in datapath 6168b412-0d9d-447a-9f39-23f5915a9dfa bound to our chassis#033[00m
Jan 22 17:31:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:58.000 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6168b412-0d9d-447a-9f39-23f5915a9dfa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:31:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:58Z|00338|binding|INFO|Setting lport c2644a02-280b-410f-a2c6-37acfe5c15da ovn-installed in OVS
Jan 22 17:31:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:31:58Z|00339|binding|INFO|Setting lport c2644a02-280b-410f-a2c6-37acfe5c15da up in Southbound
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.001 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:58.001 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cc75c591-73eb-4881-ae5c-c64e6054880b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.006 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:58 np0005592767 systemd-machined[153912]: New machine qemu-44-instance-00000059.
Jan 22 17:31:58 np0005592767 systemd[1]: Started Virtual Machine qemu-44-instance-00000059.
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.481 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:31:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:58.481 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:31:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:31:58.483 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.965 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 66bd6e4e-3db5-45d3-8495-bb100526e6a2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.967 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121118.9652522, 66bd6e4e-3db5-45d3-8495-bb100526e6a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.967 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:31:58 np0005592767 nova_compute[182623]: 2026-01-22 22:31:58.983 182627 DEBUG nova.compute.manager [None req-0b62e974-671e-4528-af07-cb611e19f699 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.014 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.017 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.046 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.047 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121118.9673023, 66bd6e4e-3db5-45d3-8495-bb100526e6a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.047 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] VM Started (Lifecycle Event)
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.068 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:31:59 np0005592767 nova_compute[182623]: 2026-01-22 22:31:59.071 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.084 182627 DEBUG nova.compute.manager [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.086 182627 DEBUG oslo_concurrency.lockutils [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.087 182627 DEBUG oslo_concurrency.lockutils [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.087 182627 DEBUG oslo_concurrency.lockutils [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.088 182627 DEBUG nova.compute.manager [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.088 182627 WARNING nova.compute.manager [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received unexpected event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with vm_state rescued and task_state None.
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.089 182627 DEBUG nova.compute.manager [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.089 182627 DEBUG oslo_concurrency.lockutils [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.090 182627 DEBUG oslo_concurrency.lockutils [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.091 182627 DEBUG oslo_concurrency.lockutils [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.091 182627 DEBUG nova.compute.manager [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.092 182627 WARNING nova.compute.manager [req-9412609f-44fd-485c-9e54-0c415df5fbce req-f004aea0-7378-44a2-849d-88e9797442e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received unexpected event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with vm_state rescued and task_state None.
Jan 22 17:32:00 np0005592767 podman[223930]: 2026-01-22 22:32:00.184707856 +0000 UTC m=+0.082832595 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.202 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121105.201849, 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.203 182627 INFO nova.compute.manager [-] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] VM Stopped (Lifecycle Event)
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.221 182627 DEBUG nova.compute.manager [None req-ef40ff11-04b3-4082-8e21-bba9da710ae1 - - - - - -] [instance: 9ed55530-e2fb-4d5d-85ff-ab54cc48c8b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:32:00 np0005592767 nova_compute[182623]: 2026-01-22 22:32:00.235 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:01 np0005592767 nova_compute[182623]: 2026-01-22 22:32:01.093 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:02.485 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.237 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.586 182627 DEBUG nova.compute.manager [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.587 182627 DEBUG nova.compute.manager [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing instance network info cache due to event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.588 182627 DEBUG oslo_concurrency.lockutils [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.588 182627 DEBUG oslo_concurrency.lockutils [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.589 182627 DEBUG nova.network.neutron [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.815 182627 DEBUG nova.compute.manager [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.816 182627 DEBUG nova.compute.manager [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing instance network info cache due to event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:32:05 np0005592767 nova_compute[182623]: 2026-01-22 22:32:05.816 182627 DEBUG oslo_concurrency.lockutils [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:32:06 np0005592767 nova_compute[182623]: 2026-01-22 22:32:06.095 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:32:07 np0005592767 nova_compute[182623]: 2026-01-22 22:32:07.211 182627 DEBUG nova.network.neutron [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updated VIF entry in instance network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:32:07 np0005592767 nova_compute[182623]: 2026-01-22 22:32:07.212 182627 DEBUG nova.network.neutron [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:32:07 np0005592767 nova_compute[182623]: 2026-01-22 22:32:07.233 182627 DEBUG oslo_concurrency.lockutils [req-f5122c75-e7c1-4472-b0cd-0539a76da8f4 req-272d6920-2bf9-4420-966f-ccc23975a820 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:32:07 np0005592767 nova_compute[182623]: 2026-01-22 22:32:07.234 182627 DEBUG oslo_concurrency.lockutils [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:32:07 np0005592767 nova_compute[182623]: 2026-01-22 22:32:07.235 182627 DEBUG nova.network.neutron [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.325 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'hostId': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.329 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000059', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'hostId': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.333 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 / tap4f4f1de0-a1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.334 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.336 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 66bd6e4e-3db5-45d3-8495-bb100526e6a2 / tapc2644a02-28 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.337 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6e89960-c76b-47a1-9815-b001be640d0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.330528', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f1961c0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': 'efbfb0718df12fef7f06cb9f3153a342c5f021befc858a61a4661e6097b3c409'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.330528', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f19d772-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '80d9abd41a8be62d38d503fa7291d735ec37bf232ea1a5ae55e0806bc14589f5'}]}, 'timestamp': '2026-01-22 22:32:07.338039', '_unique_id': '537cd43b8092418d9e10627a760c7fed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.360 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.362 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.385 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.387 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.387 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d100bb1-b1d9-4c56-aeeb-0741b0d7cb46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.343406', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f1d69c8-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.978314108, 'message_signature': 'c1e9644c8d3b77ee645418b23a9534ecc745edb2983e70e2c8505d0160eadc06'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.343406', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f1db0fe-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.978314108, 'message_signature': '3a9ba5d7ad32b530937804e45e1711ec3e7f26d4f3b6ed0d08b6042cec911d45'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.343406', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f215b0a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': '74688030a5996fe0f72bf7fcfb244b67e7e6a69fd2ac7cccab283146a3776963'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.343406', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f216dfc-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': '359902ff7dd2362f22c200f1208b3ca02bc73e5a0be02581636dee7471264585'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 
'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.343406', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f217a72-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': '56dcb2c1e38f633831c5ac82500ab21a6aea6fd872650f5270982004cb1c77ff'}]}, 'timestamp': '2026-01-22 22:32:07.387969', '_unique_id': 'f1845c3557014e658be23eb9ec0c6e04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.390 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.390 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.390 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1547a1e6-ab9b-469c-bc7d-8f9b3efe30ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.390567', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f21ee44-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': '961808cf73ea8a55d27177b8498479503b8f446343956681d5548eade09f80d0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.390567', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f21f952-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '40e61eabeed3978002e1bb352f8270e44d69a8fb2b8c39058d8042d228dfd42a'}]}, 'timestamp': '2026-01-22 22:32:07.391176', '_unique_id': '5e9272f8b3c94110b714430b44ca154c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.393 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.424 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.read.latency volume: 265914876 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.425 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.read.latency volume: 33210674 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.507 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.latency volume: 177954030 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.508 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.508 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.latency volume: 559326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0102fe3a-43ce-4c04-af12-0068575af449', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 265914876, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.393866', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f27274c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '39e7964f8a8c16a3a70f96c83ee3b8b1ef5b86bf6d77d1cd9f6f3863cd1b51bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33210674, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 
'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.393866', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f2737dc-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '0626097bf59a5b357b82c7e9335a199070cda23a80898f8df8c515ce19623fd4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 177954030, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.393866', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f33def6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '0cf5b4a6ae287cb20f146d25aa3b0c6227cf506d9f2546f663a2f38f7513d5bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.393866', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f33e932-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '86eba3978ad9ed183db3e87b9dd72bc59aabce9d9db3798fe74860dac9baf46c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 559326, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.393866', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f33f12a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '28b6aff06b1c569312b594a375ab3ca2abd5114895b0a475c5d9d6c004d09164'}]}, 'timestamp': '2026-01-22 22:32:07.508928', '_unique_id': '760a7e089dca400c951a06bf772ed638'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.512 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.512 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>]
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.512 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.513 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.513 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.513 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.513 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e693d3b-a39b-4931-87ce-65d9621e1bd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.512749', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f349472-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '1ddd2d960e3282666567b3716d68b6f4576bbaad971d1efd1cc949266e797af9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 
'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.512749', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f34a214-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '057dd5d3cfece2ca4ba00ba9335af4f086e7461b0d86c65d371021ba25ae7afa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.512749', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f34aa34-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '873d8893032bf769af38675f605db57dbb459d60466b7602312cae58b87d2117'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.512749', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f34b196-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'e51d41a4feefef8df3fa9cded637cfbebf0200f444adba7a8903fb6190e17005'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.512749', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f34bace-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'ac3e8102393807ff6f98c3e8adf811ca915fff74414ba9d736d8ebac5c8ab531'}]}, 'timestamp': '2026-01-22 22:32:07.514120', '_unique_id': 'bf1d5f0e9f004042a1b8d77c347f1f41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.515 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.516 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.read.bytes volume: 30747136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.516 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.516 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.517 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.517 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c711477-2344-46cf-ab77-dbc59ee22475', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30747136, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.516081', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f35194c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '59b051ce2e230c0317e77fac97eed68d22ac69ddab211d62c12ee33fb783cbde'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': 
None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.516081', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f35268a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': 'c92006660fb35d7ffca892773558905d6705f15892c9641a8eb3d935ccdae5dc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.516081', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f353102-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '91f4a6456251c6ff54d37d27d215f390c29a1617ebfff5fd5be99b10a3b0ea55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.516081', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f3539d6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '7d25246b7ae811736c3af1ddae8318612032b02379a810c08416d19e1f88199f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 
'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.516081', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f3541a6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'ae76b77ec186f7c3307d1e8bac5f7809d181352e8c4efc32380a3d3184829098'}]}, 'timestamp': '2026-01-22 22:32:07.517542', '_unique_id': 'd82fc035582a48f3a256bdbafc1a90f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.520 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.521 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.521 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>]
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.522 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.522 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.523 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9ad684c-3c0b-47e7-aa32-7cef3d940628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.522313', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f3612fc-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': 'aa8c7a75075d535b93e4ebba82eec2690225dbb205e0bf645892a1ccb76ce924'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.522313', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f362d8c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '41a865f4facde82b45918a0e501c61fce0c8cd59e6ea7c36ecb2b7d6af1a3cf3'}]}, 'timestamp': '2026-01-22 22:32:07.523721', '_unique_id': '5aded8eeed9a4987bd597366f746ba8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.527 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.528 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '616249f0-ceeb-4a60-9569-42d2b30410e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.527629', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f36da66-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': 'fe9b2f25c866b54729d4a30630d926497e37e761eec236ed13b74513c4e73503'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.527629', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f36e8c6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': 'f4deafcdb0d3b1d0dc71eccaa50176b7768f06f6593cb1b52912c076ab6c91b0'}]}, 'timestamp': '2026-01-22 22:32:07.528409', '_unique_id': '2c3481b6f6424d7b8045c209baf3e726'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.533 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.533 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09df79a0-797c-4e71-bea3-cdb657366f2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.533062', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f37ac0c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': 'c652b8bed1b6d50cfc914d1d997da1faf5eb07faf47cbd7e83713c66bcda42a3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.533062', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f37b828-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '7ef89b7f3177bec84ead613413683c81363905fa023a5e20edcc3b67495591f3'}]}, 'timestamp': '2026-01-22 22:32:07.533725', '_unique_id': '826cd3839a62493885e6b9aff16034f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.534 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.535 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.535 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>]
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.535 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df4d6707-16db-4a04-b8cd-7e396a13776e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.535850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f38176e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': '342853154e3af4471a91ac860fbccb238d5a35ee498fc01c143d939ea662ed4d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.535850', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f3822ea-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '0f70e5ed3295c9d9f4c28df0643b0c6b39f6301cb6037a85530c2178e61d1f76'}]}, 'timestamp': '2026-01-22 22:32:07.536416', '_unique_id': '112b79d995734d6993addb27abca31b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.536 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.538 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.538 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a89bb632-15ea-4ca5-abae-395937a510d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.538322', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f38797a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': '5bd183d26d965bc92cc742ab707ca5670900026bd75b1bba71ee34cd8fa28682'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.538322', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f3885a0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '871565aeccb23c7e30920864ef978f5757a0639c12b6f9394a49c809c5fcf3d0'}]}, 'timestamp': '2026-01-22 22:32:07.538977', '_unique_id': '92d17d67bac544b89a01fbe259dbc95b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.539 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.540 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.540 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.541 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.541 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.541 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.542 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b08fcd78-6999-4768-8b2e-02aacf0d746d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.540754', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f38d960-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.978314108, 'message_signature': '10f639e662603dd680b7a9edf0f23f41cbb81e8d7ec59676669f713ad64a93ee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 
'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.540754', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f38e716-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.978314108, 'message_signature': '67751b9cb1506a7a20658dd022f4b565fad5e72dc72793643e85a72beb031a6d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.540754', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f38f490-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': '4a4c86042aad1c2bb3a2c55889d5eeb576335dfe0272fd2a3cdcab2ba18a64b7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.540754', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f3902be-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': 'f79a6d1acc6ededf08bc0a7e8ef3b8a174a1c768c49ad28a98109308fa33c789'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 
'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.540754', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f390f70-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': '400997aee834198e12972f5464396b589fba030a7a05a1aefa6e8e8ff6cea07e'}]}, 'timestamp': '2026-01-22 22:32:07.542494', '_unique_id': 'ace48464b3c04eb49b78ecc4d41c4ff0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.543 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.544 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.544 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.545 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.545 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.545 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e854cd23-319c-4d3f-b8e5-cf812dd8cd4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.544497', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f396b14-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.978314108, 'message_signature': '288b8c09b98d25c980fe0b2e80a09c4c701443fb876e3aa76f4a6a5b5b661709'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.544497', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f3976c2-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.978314108, 'message_signature': '2f0789efd531d09df3c8485dcde097561cfe7321441090741652b4c1edd2cdc8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.544497', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f3982c0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': 'cc23ab08d59a4a4a4c225700b83a2b417c36a46ff1c97af8fae59a866cf80c62'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.544497', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f398f2c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': 'cbd8452bd88a43fe4ce98d32b7261e6a24b1e071ebc8f8264123e0432d093cbd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 
'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.544497', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f399ae4-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.998511279, 'message_signature': '451149f77d9b13e568085f6c3dfd0405bca0902e36224943f4ffa3cc6081fadc'}]}, 'timestamp': '2026-01-22 22:32:07.546087', '_unique_id': '407ba8c7d4774859b6e379c49540af35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.546 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.548 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.548 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.548 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd71f9a2f-afb1-4ded-9b0d-949c441f09b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.548203', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f39fc14-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': '96a7b4877aaf8e610897064f5b1ef2c33b1a24a3eca546dcfb9281f94d930474'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.548203', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f3a08c6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': 'eee05d844661a78f3bf71df69c0c14a7b24304a64aac4149422cbb2c886f1918'}]}, 'timestamp': '2026-01-22 22:32:07.548888', '_unique_id': '6e650c6e73314f96abea4757022e92a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.549 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.550 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.569 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/memory.usage volume: 42.5546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.590 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.591 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 66bd6e4e-3db5-45d3-8495-bb100526e6a2: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d897732-d9c4-4d48-bf08-4b47c1c29d34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.5546875, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'timestamp': '2026-01-22T22:32:07.550671', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2f3d3cf8-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.204087175, 'message_signature': '0b34c46a76c76af00ed927151222438563fb1f5a2b2eb55d66d1aa791a5e6f33'}]}, 'timestamp': '2026-01-22 22:32:07.591380', '_unique_id': 'b91313e505904515885307793c1844f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.594 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.594 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15b661c6-1ff9-49ea-af8a-c3a753a2161b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.594120', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f41007c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': 'b1ef4308d2ff07383f0d9887b907e5fa1d105a89605d37ca0b9f40cd9715bfff'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.594120', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f41108a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': 'fe8bfd1102223cd9436ec1b49aed7ce8ed7ab83d8824259e9be2ef4c66a6ad82'}]}, 'timestamp': '2026-01-22 22:32:07.594976', '_unique_id': 'cda5f704cc1643db9e1dfd608b2a060c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.597 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.read.requests volume: 1108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.597 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.598 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.598 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.598 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baf21b92-131d-4697-b3ce-cd3e081467aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1108, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.597270', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f4179bc-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '7942ad907ac86a9fe40d028bede11e57c2df4e92adb89c8e4963acc73a17eafc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 
'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.597270', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f41877c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '7a2059b486ad771cb71ebca840bd863347856f5b6718da282db069ce4978703b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.597270', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f419730-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'bd6d28df9d9512b979db6d83ee5f6cabd86e6ecd7ed4a8e800712f05bd0c0fc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.597270', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f41a4a0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'a86d644d6913c84f7f4348e23e37564492bb34679f3a9637a64c03848dc5424b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.597270', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f41b198-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'e71fa75dfc1378e32b5aff88f0597be219bd346093b1b4a235bb9787636bf3ac'}]}, 'timestamp': '2026-01-22 22:32:07.599085', '_unique_id': 'be1f52656f704961b6ecc8f92fd62672'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.599 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.601 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.write.bytes volume: 72859648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.601 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.602 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.602 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.602 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33018b4f-b8b2-44ab-863f-ed016c73ff7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72859648, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.601370', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f42193a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '5dcc91f09e59426a35f0c9955faa1efdbc0f6c5883cf73042b61c3d09ca9700d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 
'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.601370', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f422768-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '2c5c300b350e6c273aec75fec31402b3e927d59883ec92dee6c6a266a5eeeb30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.601370', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f4235a0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'eca397b88bf730172658bbb5e94e15c7a1d7d2b2a166da893222efc26bf3e002'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.601370', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f42428e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '38ea2f40943983444a0596672ff8df1048c8e8c70ce6edf586bba34414019cec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 
'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.601370', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f424f2c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '306a125d36cbabf2cf37682ed87ce84dbea41c32488d49357bd9275cb342b9f0'}]}, 'timestamp': '2026-01-22 22:32:07.603112', '_unique_id': 'ec5be9e60b66479fae8fee7848a68e68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.603 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.605 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.write.latency volume: 3172039397 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.605 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.606 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.606 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.606 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09e747f8-fc95-4db9-aad5-5499fc399e96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3172039397, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-vda', 'timestamp': '2026-01-22T22:32:07.605350', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f42b4da-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': 'acc4231d512ffea6af02516167b2af3512e1d67e1cecc325d477d8cf9859f06d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 
'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-sda', 'timestamp': '2026-01-22T22:32:07.605350', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f42c312-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.028828177, 'message_signature': '76844f92d3d0a2669b9a3462a906ae501fda8ef62d4f2eb99a7ec4ee8a5a3c07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vda', 'timestamp': '2026-01-22T22:32:07.605350', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2f42d12c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'cd789a07a24e06b739bf7c650c0ff947f36914b2feacaf80d021d8e472160f0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-vdb', 'timestamp': '2026-01-22T22:32:07.605350', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '2f42de4c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': 'cb9166dc552a0e55cfc6e90ca4ef8b1de45cbbf5a2f6a6501ba50002d76b653d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2-sda', 'timestamp': '2026-01-22T22:32:07.605350', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2f42ec8e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.060415861, 'message_signature': '19d25058008841a62d5f0ef8bd494d791a221845f28c81b93b37bd7c7c29bb5d'}]}, 'timestamp': '2026-01-22 22:32:07.607220', '_unique_id': '2fac5b8b85e6449c826350601afefa1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.607 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.609 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.610 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1126799150>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1902499833>]
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.610 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.610 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.610 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdd28410-45f5-4c7d-8a67-d405aef2b6c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1520, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'instance-0000005a-aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-tap4f4f1de0-a1', 'timestamp': '2026-01-22T22:32:07.610451', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'tap4f4f1de0-a1', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:99:5c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4f4f1de0-a1'}, 'message_id': '2f437c30-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.965413523, 'message_signature': 'cca3eccea6237f625a42e960a361af80ac1617ca07e4dbc4626be257dee90e4e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': 'instance-00000059-66bd6e4e-3db5-45d3-8495-bb100526e6a2-tapc2644a02-28', 'timestamp': '2026-01-22T22:32:07.610451', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'tapc2644a02-28', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:29:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2644a02-28'}, 'message_id': '2f438b6c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4723.969969682, 'message_signature': '9f3fc9c38bce7227442fc333fd6500374f99a76eeb9c0ae4506d03debc80d30e'}]}, 'timestamp': '2026-01-22 22:32:07.611251', '_unique_id': 'e5f466a9020c47089640388d8af4ea9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.611 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.613 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.613 12 DEBUG ceilometer.compute.pollsters [-] aa33ef57-9092-4a0a-bf8f-fd0041ab60e7/cpu volume: 11310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.613 12 DEBUG ceilometer.compute.pollsters [-] 66bd6e4e-3db5-45d3-8495-bb100526e6a2/cpu volume: 8210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a95fa55c-fdc3-446f-97e7-bf79c0d7ed9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11310000000, 'user_id': 'b6f50d0e6a7444f0ac9c928363915afb', 'user_name': None, 'project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'project_name': None, 'resource_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'timestamp': '2026-01-22T22:32:07.613464', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1126799150', 'name': 'instance-0000005a', 'instance_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'instance_type': 'm1.nano', 'host': 'd978bc4555804666fe4ad98ef23200ff3c5757a9a63e61adeb2ee118', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8bcaf91e-26cd-4687-9abd-8185bd0c5241'}, 'image_ref': '8bcaf91e-26cd-4687-9abd-8185bd0c5241', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2f43f368-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.204087175, 'message_signature': 'aff54af83194f9524b02bb54bf12c3e6c5364bd610385c464b3b6b9e97e537cb'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8210000000, 'user_id': '03f7ced9a7ee47849ffa16934d67478e', 'user_name': None, 'project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'project_name': None, 'resource_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'timestamp': '2026-01-22T22:32:07.613464', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1902499833', 'name': 'instance-00000059', 'instance_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'instance_type': 'm1.nano', 'host': '6f596a0dbe8f9b8d22c03ab8b6e1c3c2d288cf47169553b6a9c9599a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2f44027c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4724.225460449, 'message_signature': 'ecb6cc3d4a39be3e68be49ddd32da1dc8850885afd829690096ed59a466438fe'}]}, 'timestamp': '2026-01-22 22:32:07.614286', '_unique_id': 'f0baba56d17e4f6fb77be5e6b0d7c852'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:32:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:32:07.614 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:32:09 np0005592767 nova_compute[182623]: 2026-01-22 22:32:09.087 182627 DEBUG nova.network.neutron [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updated VIF entry in instance network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:32:09 np0005592767 nova_compute[182623]: 2026-01-22 22:32:09.088 182627 DEBUG nova.network.neutron [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:09 np0005592767 nova_compute[182623]: 2026-01-22 22:32:09.105 182627 DEBUG oslo_concurrency.lockutils [req-79650bf7-99ac-42a7-9396-95a16f2caf68 req-5880de83-dc9d-44bb-9a27-6e5d632ef83a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:09 np0005592767 NetworkManager[54973]: <info>  [1769121129.2514] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 22 17:32:09 np0005592767 NetworkManager[54973]: <info>  [1769121129.2525] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 22 17:32:09 np0005592767 nova_compute[182623]: 2026-01-22 22:32:09.259 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:09 np0005592767 nova_compute[182623]: 2026-01-22 22:32:09.313 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:09Z|00340|binding|INFO|Releasing lport 0a1fd4a8-b506-4c9d-9846-1c0ab542e465 from this chassis (sb_readonly=0)
Jan 22 17:32:09 np0005592767 nova_compute[182623]: 2026-01-22 22:32:09.325 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:10 np0005592767 nova_compute[182623]: 2026-01-22 22:32:10.240 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:10 np0005592767 nova_compute[182623]: 2026-01-22 22:32:10.291 182627 DEBUG nova.compute.manager [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:10 np0005592767 nova_compute[182623]: 2026-01-22 22:32:10.291 182627 DEBUG nova.compute.manager [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing instance network info cache due to event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:32:10 np0005592767 nova_compute[182623]: 2026-01-22 22:32:10.292 182627 DEBUG oslo_concurrency.lockutils [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:10 np0005592767 nova_compute[182623]: 2026-01-22 22:32:10.292 182627 DEBUG oslo_concurrency.lockutils [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:10 np0005592767 nova_compute[182623]: 2026-01-22 22:32:10.292 182627 DEBUG nova.network.neutron [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:32:11 np0005592767 nova_compute[182623]: 2026-01-22 22:32:11.138 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:12.105 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:12.109 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:12.111 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:12 np0005592767 podman[223966]: 2026-01-22 22:32:12.176151729 +0000 UTC m=+0.087907308 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:32:12 np0005592767 nova_compute[182623]: 2026-01-22 22:32:12.358 182627 DEBUG nova.network.neutron [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updated VIF entry in instance network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:32:12 np0005592767 nova_compute[182623]: 2026-01-22 22:32:12.359 182627 DEBUG nova.network.neutron [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:12 np0005592767 nova_compute[182623]: 2026-01-22 22:32:12.397 182627 DEBUG oslo_concurrency.lockutils [req-7f45c16f-5ebd-4212-9034-00057b160a8a req-a36429e4-b053-4baf-81dc-dcc662e6d376 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:13Z|00341|binding|INFO|Releasing lport 0a1fd4a8-b506-4c9d-9846-1c0ab542e465 from this chassis (sb_readonly=0)
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.921 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.922 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.922 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.922 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.922 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.935 182627 INFO nova.compute.manager [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Terminating instance#033[00m
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.946 182627 DEBUG nova.compute.manager [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:32:13 np0005592767 kernel: tap4f4f1de0-a1 (unregistering): left promiscuous mode
Jan 22 17:32:13 np0005592767 NetworkManager[54973]: <info>  [1769121133.9753] device (tap4f4f1de0-a1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.983 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:13Z|00342|binding|INFO|Releasing lport 4f4f1de0-a1fa-42ab-98de-698c12368baf from this chassis (sb_readonly=0)
Jan 22 17:32:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:13Z|00343|binding|INFO|Setting lport 4f4f1de0-a1fa-42ab-98de-698c12368baf down in Southbound
Jan 22 17:32:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:13Z|00344|binding|INFO|Removing iface tap4f4f1de0-a1 ovn-installed in OVS
Jan 22 17:32:13 np0005592767 nova_compute[182623]: 2026-01-22 22:32:13.985 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:13.996 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:99:5c 10.100.0.4'], port_security=['fa:16:3e:ee:99:5c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'aa33ef57-9092-4a0a-bf8f-fd0041ab60e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f234f62b-5371-4527-94e7-91cf5da3055e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '802c49a328ca49e3a4ea4e46b9a9f5eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3b57348-3994-471b-bd73-e78507392f5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c3e5cc-ee0d-48e7-8eab-3e968c7ed6fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4f4f1de0-a1fa-42ab-98de-698c12368baf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.000 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4f4f1de0-a1fa-42ab-98de-698c12368baf in datapath f234f62b-5371-4527-94e7-91cf5da3055e unbound from our chassis#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.005 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f234f62b-5371-4527-94e7-91cf5da3055e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.008 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61d9487d-8f7a-4be1-b88e-e77e79076740]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.009 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e namespace which is not needed anymore#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:14 np0005592767 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 22 17:32:14 np0005592767 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005a.scope: Consumed 13.975s CPU time.
Jan 22 17:32:14 np0005592767 systemd-machined[153912]: Machine qemu-42-instance-0000005a terminated.
Jan 22 17:32:14 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [NOTICE]   (223617) : haproxy version is 2.8.14-c23fe91
Jan 22 17:32:14 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [NOTICE]   (223617) : path to executable is /usr/sbin/haproxy
Jan 22 17:32:14 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [WARNING]  (223617) : Exiting Master process...
Jan 22 17:32:14 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [ALERT]    (223617) : Current worker (223623) exited with code 143 (Terminated)
Jan 22 17:32:14 np0005592767 neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e[223613]: [WARNING]  (223617) : All workers exited. Exiting... (0)
Jan 22 17:32:14 np0005592767 systemd[1]: libpod-3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f.scope: Deactivated successfully.
Jan 22 17:32:14 np0005592767 podman[224009]: 2026-01-22 22:32:14.173281871 +0000 UTC m=+0.046476286 container died 3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:32:14 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:32:14 np0005592767 systemd[1]: var-lib-containers-storage-overlay-4f9da1943a40f2d669f980b742c92b3cd56e4760d971e710d8fe2af8abf063e2-merged.mount: Deactivated successfully.
Jan 22 17:32:14 np0005592767 podman[224009]: 2026-01-22 22:32:14.219992383 +0000 UTC m=+0.093186798 container cleanup 3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.219 182627 INFO nova.virt.libvirt.driver [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Instance destroyed successfully.#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.220 182627 DEBUG nova.objects.instance [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lazy-loading 'resources' on Instance uuid aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:14 np0005592767 systemd[1]: libpod-conmon-3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f.scope: Deactivated successfully.
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.249 182627 DEBUG nova.virt.libvirt.vif [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1126799150',display_name='tempest-ListServerFiltersTestJSON-instance-1126799150',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1126799150',id=90,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='802c49a328ca49e3a4ea4e46b9a9f5eb',ramdisk_id='',reservation_id='r-45ry5zcn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1169398826',owner_user_name='tempest-ListServerFiltersTestJSON-1169398826-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:33Z,user_data=None,user_id='b6f50d0e6a7444f0ac9c928363915afb',uuid=aa33ef57-9092-4a0a-bf8f-fd0041ab60e7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.250 182627 DEBUG nova.network.os_vif_util [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converting VIF {"id": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "address": "fa:16:3e:ee:99:5c", "network": {"id": "f234f62b-5371-4527-94e7-91cf5da3055e", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-512610996-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "802c49a328ca49e3a4ea4e46b9a9f5eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f4f1de0-a1", "ovs_interfaceid": "4f4f1de0-a1fa-42ab-98de-698c12368baf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.250 182627 DEBUG nova.network.os_vif_util [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.251 182627 DEBUG os_vif [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.253 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.253 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f4f1de0-a1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.257 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.259 182627 INFO os_vif [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:99:5c,bridge_name='br-int',has_traffic_filtering=True,id=4f4f1de0-a1fa-42ab-98de-698c12368baf,network=Network(f234f62b-5371-4527-94e7-91cf5da3055e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f4f1de0-a1')#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.260 182627 INFO nova.virt.libvirt.driver [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Deleting instance files /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7_del#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.260 182627 INFO nova.virt.libvirt.driver [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Deletion of /var/lib/nova/instances/aa33ef57-9092-4a0a-bf8f-fd0041ab60e7_del complete#033[00m
Jan 22 17:32:14 np0005592767 podman[224057]: 2026-01-22 22:32:14.285363482 +0000 UTC m=+0.044073488 container remove 3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.293 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[26dcc199-8c88-4b58-b24d-03926c6e7ccf]: (4, ('Thu Jan 22 10:32:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e (3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f)\n3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f\nThu Jan 22 10:32:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e (3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f)\n3a286179d2df3b6be64bcf4d04229bfee998d827e4978efe430fa79f5891703f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.295 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f40bc5df-e271-4e5e-abae-6868ba992a34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.296 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf234f62b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.297 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:14 np0005592767 kernel: tapf234f62b-50: left promiscuous mode
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.308 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.312 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3592cd9e-7a58-453a-90da-3dfd307fa74c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.332 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[88f1db4c-34d4-4175-982c-3e3b26474a26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.334 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[79bbcce9-ac17-4e55-bc72-ccc4c14a87a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.356 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2f4975-3cac-4b62-b977-5fb54e61c3a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468808, 'reachable_time': 39589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224072, 'error': None, 'target': 'ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 systemd[1]: run-netns-ovnmeta\x2df234f62b\x2d5371\x2d4527\x2d94e7\x2d91cf5da3055e.mount: Deactivated successfully.
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.361 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f234f62b-5371-4527-94e7-91cf5da3055e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:32:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:14.361 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[5c10d4f2-6545-476c-8f23-b5543c4c2c1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.368 182627 DEBUG nova.compute.manager [req-df985c07-637c-4a82-97ec-030eff4cc6c7 req-b5d35029-6fb9-440a-9481-ebf7fe7e60ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-vif-unplugged-4f4f1de0-a1fa-42ab-98de-698c12368baf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.368 182627 DEBUG oslo_concurrency.lockutils [req-df985c07-637c-4a82-97ec-030eff4cc6c7 req-b5d35029-6fb9-440a-9481-ebf7fe7e60ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.368 182627 DEBUG oslo_concurrency.lockutils [req-df985c07-637c-4a82-97ec-030eff4cc6c7 req-b5d35029-6fb9-440a-9481-ebf7fe7e60ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.369 182627 DEBUG oslo_concurrency.lockutils [req-df985c07-637c-4a82-97ec-030eff4cc6c7 req-b5d35029-6fb9-440a-9481-ebf7fe7e60ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.369 182627 DEBUG nova.compute.manager [req-df985c07-637c-4a82-97ec-030eff4cc6c7 req-b5d35029-6fb9-440a-9481-ebf7fe7e60ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] No waiting events found dispatching network-vif-unplugged-4f4f1de0-a1fa-42ab-98de-698c12368baf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.369 182627 DEBUG nova.compute.manager [req-df985c07-637c-4a82-97ec-030eff4cc6c7 req-b5d35029-6fb9-440a-9481-ebf7fe7e60ad 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-vif-unplugged-4f4f1de0-a1fa-42ab-98de-698c12368baf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.381 182627 INFO nova.compute.manager [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.381 182627 DEBUG oslo.service.loopingcall [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.382 182627 DEBUG nova.compute.manager [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:32:14 np0005592767 nova_compute[182623]: 2026-01-22 22:32:14.382 182627 DEBUG nova.network.neutron [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.075 182627 DEBUG nova.compute.manager [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.075 182627 DEBUG nova.compute.manager [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing instance network info cache due to event network-changed-c2644a02-280b-410f-a2c6-37acfe5c15da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.076 182627 DEBUG oslo_concurrency.lockutils [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.076 182627 DEBUG oslo_concurrency.lockutils [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.076 182627 DEBUG nova.network.neutron [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Refreshing network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:15 np0005592767 nova_compute[182623]: 2026-01-22 22:32:15.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.139 182627 DEBUG nova.network.neutron [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.142 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.209 182627 INFO nova.compute.manager [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Took 1.83 seconds to deallocate network for instance.#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.319 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.320 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.404 182627 DEBUG nova.compute.provider_tree [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.431 182627 DEBUG nova.scheduler.client.report [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.451 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.507 182627 INFO nova.scheduler.client.report [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Deleted allocations for instance aa33ef57-9092-4a0a-bf8f-fd0041ab60e7#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.597 182627 DEBUG oslo_concurrency.lockutils [None req-9cfc1642-4857-403c-bdc2-d4de3e6d5d1c b6f50d0e6a7444f0ac9c928363915afb 802c49a328ca49e3a4ea4e46b9a9f5eb - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.632 182627 DEBUG nova.compute.manager [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.632 182627 DEBUG oslo_concurrency.lockutils [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.632 182627 DEBUG oslo_concurrency.lockutils [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.633 182627 DEBUG oslo_concurrency.lockutils [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "aa33ef57-9092-4a0a-bf8f-fd0041ab60e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.633 182627 DEBUG nova.compute.manager [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] No waiting events found dispatching network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.633 182627 WARNING nova.compute.manager [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received unexpected event network-vif-plugged-4f4f1de0-a1fa-42ab-98de-698c12368baf for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:32:16 np0005592767 nova_compute[182623]: 2026-01-22 22:32:16.633 182627 DEBUG nova.compute.manager [req-190ec954-8b30-472e-8059-1ddcf9817b64 req-affb069b-06bf-48f2-b017-11f0d408b479 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Received event network-vif-deleted-4f4f1de0-a1fa-42ab-98de-698c12368baf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:17 np0005592767 nova_compute[182623]: 2026-01-22 22:32:17.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:18 np0005592767 nova_compute[182623]: 2026-01-22 22:32:18.444 182627 DEBUG nova.network.neutron [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updated VIF entry in instance network info cache for port c2644a02-280b-410f-a2c6-37acfe5c15da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:32:18 np0005592767 nova_compute[182623]: 2026-01-22 22:32:18.445 182627 DEBUG nova.network.neutron [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:18 np0005592767 nova_compute[182623]: 2026-01-22 22:32:18.480 182627 DEBUG oslo_concurrency.lockutils [req-14d7bb42-f24c-418c-bd91-6d4c25653f1a req-58077d1e-8b56-44ca-9e0b-abcaad3573c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:18 np0005592767 nova_compute[182623]: 2026-01-22 22:32:18.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:19 np0005592767 podman[224073]: 2026-01-22 22:32:19.180895364 +0000 UTC m=+0.103444438 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:32:19 np0005592767 podman[224074]: 2026-01-22 22:32:19.181996785 +0000 UTC m=+0.075963690 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:32:19 np0005592767 nova_compute[182623]: 2026-01-22 22:32:19.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:19 np0005592767 nova_compute[182623]: 2026-01-22 22:32:19.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:19 np0005592767 nova_compute[182623]: 2026-01-22 22:32:19.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:32:20 np0005592767 nova_compute[182623]: 2026-01-22 22:32:20.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:20 np0005592767 nova_compute[182623]: 2026-01-22 22:32:20.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:20 np0005592767 nova_compute[182623]: 2026-01-22 22:32:20.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:32:20 np0005592767 nova_compute[182623]: 2026-01-22 22:32:20.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.141 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.142 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.142 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.142 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.185 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.298 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.421 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.496 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.496 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.496 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.497 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.497 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.511 182627 INFO nova.compute.manager [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Terminating instance#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.524 182627 DEBUG nova.compute.manager [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:32:21 np0005592767 kernel: tapc2644a02-28 (unregistering): left promiscuous mode
Jan 22 17:32:21 np0005592767 NetworkManager[54973]: <info>  [1769121141.5492] device (tapc2644a02-28): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:21Z|00345|binding|INFO|Releasing lport c2644a02-280b-410f-a2c6-37acfe5c15da from this chassis (sb_readonly=0)
Jan 22 17:32:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:21Z|00346|binding|INFO|Setting lport c2644a02-280b-410f-a2c6-37acfe5c15da down in Southbound
Jan 22 17:32:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:21Z|00347|binding|INFO|Removing iface tapc2644a02-28 ovn-installed in OVS
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:21.569 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:29:6b 10.100.0.14'], port_security=['fa:16:3e:d3:29:6b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '66bd6e4e-3db5-45d3-8495-bb100526e6a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6168b412-0d9d-447a-9f39-23f5915a9dfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2c265c40d2b4195b882f2503b5ebd3c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '731874d8-38e6-4c19-a4b9-1132d12c448d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89edec45-9e1e-4610-916c-77f10f88a664, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=c2644a02-280b-410f-a2c6-37acfe5c15da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:32:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:21.571 104135 INFO neutron.agent.ovn.metadata.agent [-] Port c2644a02-280b-410f-a2c6-37acfe5c15da in datapath 6168b412-0d9d-447a-9f39-23f5915a9dfa unbound from our chassis#033[00m
Jan 22 17:32:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:21.573 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6168b412-0d9d-447a-9f39-23f5915a9dfa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.575 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:21.575 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0b4db839-dc5b-45f2-b734-c43c2de0793d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:21 np0005592767 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000059.scope: Deactivated successfully.
Jan 22 17:32:21 np0005592767 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000059.scope: Consumed 13.529s CPU time.
Jan 22 17:32:21 np0005592767 systemd-machined[153912]: Machine qemu-44-instance-00000059 terminated.
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.826 182627 INFO nova.virt.libvirt.driver [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Instance destroyed successfully.#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.826 182627 DEBUG nova.objects.instance [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lazy-loading 'resources' on Instance uuid 66bd6e4e-3db5-45d3-8495-bb100526e6a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.850 182627 DEBUG nova.virt.libvirt.vif [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:31:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1902499833',display_name='tempest-ServerRescueTestJSONUnderV235-server-1902499833',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1902499833',id=89,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:31:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a2c265c40d2b4195b882f2503b5ebd3c',ramdisk_id='',reservation_id='r-112y35u5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1110728559',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1110728559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:31:59Z,user_data=None,user_id='03f7ced9a7ee47849ffa16934d67478e',uuid=66bd6e4e-3db5-45d3-8495-bb100526e6a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.850 182627 DEBUG nova.network.os_vif_util [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converting VIF {"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.852 182627 DEBUG nova.network.os_vif_util [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.852 182627 DEBUG os_vif [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.854 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.854 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2644a02-28, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.856 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.858 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.858 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.860 182627 INFO os_vif [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:29:6b,bridge_name='br-int',has_traffic_filtering=True,id=c2644a02-280b-410f-a2c6-37acfe5c15da,network=Network(6168b412-0d9d-447a-9f39-23f5915a9dfa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2644a02-28')#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.860 182627 INFO nova.virt.libvirt.driver [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Deleting instance files /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2_del#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.861 182627 INFO nova.virt.libvirt.driver [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Deletion of /var/lib/nova/instances/66bd6e4e-3db5-45d3-8495-bb100526e6a2_del complete#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.941 182627 INFO nova.compute.manager [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.942 182627 DEBUG oslo.service.loopingcall [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.942 182627 DEBUG nova.compute.manager [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:32:21 np0005592767 nova_compute[182623]: 2026-01-22 22:32:21.942 182627 DEBUG nova.network.neutron [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:32:22 np0005592767 nova_compute[182623]: 2026-01-22 22:32:22.319 182627 DEBUG nova.compute.manager [req-cd1e64d6-374a-4e32-b550-d5d031a255f8 req-dd1b03ae-ecbc-4063-915b-cad1f60658db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-unplugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:22 np0005592767 nova_compute[182623]: 2026-01-22 22:32:22.319 182627 DEBUG oslo_concurrency.lockutils [req-cd1e64d6-374a-4e32-b550-d5d031a255f8 req-dd1b03ae-ecbc-4063-915b-cad1f60658db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:22 np0005592767 nova_compute[182623]: 2026-01-22 22:32:22.320 182627 DEBUG oslo_concurrency.lockutils [req-cd1e64d6-374a-4e32-b550-d5d031a255f8 req-dd1b03ae-ecbc-4063-915b-cad1f60658db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:22 np0005592767 nova_compute[182623]: 2026-01-22 22:32:22.320 182627 DEBUG oslo_concurrency.lockutils [req-cd1e64d6-374a-4e32-b550-d5d031a255f8 req-dd1b03ae-ecbc-4063-915b-cad1f60658db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:22 np0005592767 nova_compute[182623]: 2026-01-22 22:32:22.321 182627 DEBUG nova.compute.manager [req-cd1e64d6-374a-4e32-b550-d5d031a255f8 req-dd1b03ae-ecbc-4063-915b-cad1f60658db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-unplugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:22 np0005592767 nova_compute[182623]: 2026-01-22 22:32:22.321 182627 DEBUG nova.compute.manager [req-cd1e64d6-374a-4e32-b550-d5d031a255f8 req-dd1b03ae-ecbc-4063-915b-cad1f60658db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-unplugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.559 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [{"id": "c2644a02-280b-410f-a2c6-37acfe5c15da", "address": "fa:16:3e:d3:29:6b", "network": {"id": "6168b412-0d9d-447a-9f39-23f5915a9dfa", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1258169983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a2c265c40d2b4195b882f2503b5ebd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2644a02-28", "ovs_interfaceid": "c2644a02-280b-410f-a2c6-37acfe5c15da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.587 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-66bd6e4e-3db5-45d3-8495-bb100526e6a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.587 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.588 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.613 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.613 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.614 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.614 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.674 182627 DEBUG nova.network.neutron [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.691 182627 INFO nova.compute.manager [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Took 1.75 seconds to deallocate network for instance.#033[00m
Jan 22 17:32:23 np0005592767 podman[224150]: 2026-01-22 22:32:23.718066195 +0000 UTC m=+0.061725038 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:32:23 np0005592767 podman[224151]: 2026-01-22 22:32:23.732161034 +0000 UTC m=+0.067081319 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.784 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.784 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.820 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.821 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5712MB free_disk=73.23172378540039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.821 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.849 182627 DEBUG nova.compute.provider_tree [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.853 182627 DEBUG nova.compute.manager [req-ed762007-59f9-4623-9a6d-b9149df27382 req-811ad1a5-9e32-445a-9025-f3dca0971be6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-deleted-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.863 182627 DEBUG nova.scheduler.client.report [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.884 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.886 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.916 182627 INFO nova.scheduler.client.report [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Deleted allocations for instance 66bd6e4e-3db5-45d3-8495-bb100526e6a2#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.977 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:32:23 np0005592767 nova_compute[182623]: 2026-01-22 22:32:23.977 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.001 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.034 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.050 182627 DEBUG oslo_concurrency.lockutils [None req-13a19adb-e323-4550-abb4-8eccde1a29fa 03f7ced9a7ee47849ffa16934d67478e a2c265c40d2b4195b882f2503b5ebd3c - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.058 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.058 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.368 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.544 182627 DEBUG nova.compute.manager [req-a997cf2d-347a-4253-b5ca-5ac823d10ef7 req-c2e66600-235c-4f23-8eb4-5fad2555a8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.545 182627 DEBUG oslo_concurrency.lockutils [req-a997cf2d-347a-4253-b5ca-5ac823d10ef7 req-c2e66600-235c-4f23-8eb4-5fad2555a8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.546 182627 DEBUG oslo_concurrency.lockutils [req-a997cf2d-347a-4253-b5ca-5ac823d10ef7 req-c2e66600-235c-4f23-8eb4-5fad2555a8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.546 182627 DEBUG oslo_concurrency.lockutils [req-a997cf2d-347a-4253-b5ca-5ac823d10ef7 req-c2e66600-235c-4f23-8eb4-5fad2555a8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "66bd6e4e-3db5-45d3-8495-bb100526e6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.547 182627 DEBUG nova.compute.manager [req-a997cf2d-347a-4253-b5ca-5ac823d10ef7 req-c2e66600-235c-4f23-8eb4-5fad2555a8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] No waiting events found dispatching network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:24 np0005592767 nova_compute[182623]: 2026-01-22 22:32:24.548 182627 WARNING nova.compute.manager [req-a997cf2d-347a-4253-b5ca-5ac823d10ef7 req-c2e66600-235c-4f23-8eb4-5fad2555a8ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Received unexpected event network-vif-plugged-c2644a02-280b-410f-a2c6-37acfe5c15da for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:32:26 np0005592767 nova_compute[182623]: 2026-01-22 22:32:26.184 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:26 np0005592767 nova_compute[182623]: 2026-01-22 22:32:26.857 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:28 np0005592767 nova_compute[182623]: 2026-01-22 22:32:28.668 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:29 np0005592767 nova_compute[182623]: 2026-01-22 22:32:29.218 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121134.2166345, aa33ef57-9092-4a0a-bf8f-fd0041ab60e7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:29 np0005592767 nova_compute[182623]: 2026-01-22 22:32:29.218 182627 INFO nova.compute.manager [-] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:32:29 np0005592767 nova_compute[182623]: 2026-01-22 22:32:29.239 182627 DEBUG nova.compute.manager [None req-18a5a5fd-f0cf-4bd5-8a36-54da83cf433c - - - - - -] [instance: aa33ef57-9092-4a0a-bf8f-fd0041ab60e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:31 np0005592767 podman[224191]: 2026-01-22 22:32:31.18130517 +0000 UTC m=+0.093717853 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:32:31 np0005592767 nova_compute[182623]: 2026-01-22 22:32:31.186 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:31 np0005592767 nova_compute[182623]: 2026-01-22 22:32:31.860 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.345 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.346 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.388 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.549 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.550 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.557 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.558 182627 INFO nova.compute.claims [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.703 182627 DEBUG nova.compute.provider_tree [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.726 182627 DEBUG nova.scheduler.client.report [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.764 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.764 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.830 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.830 182627 DEBUG nova.network.neutron [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.858 182627 INFO nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:32:32 np0005592767 nova_compute[182623]: 2026-01-22 22:32:32.887 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.030 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.031 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.032 182627 INFO nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Creating image(s)#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.032 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "/var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.033 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "/var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.034 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "/var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.050 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.112 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.114 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.114 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.131 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.189 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.190 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.211 182627 DEBUG nova.policy [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3738d2d62baa4adc84f010ecf9eda9ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '01c8405dbfef4380888a9355710f3976', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.226 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.227 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.228 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.298 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.300 182627 DEBUG nova.virt.disk.api [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Checking if we can resize image /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.301 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.357 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.359 182627 DEBUG nova.virt.disk.api [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Cannot resize image /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.359 182627 DEBUG nova.objects.instance [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lazy-loading 'migration_context' on Instance uuid 4083151b-cb74-4902-b4e9-64b23a3403d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.379 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.379 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Ensure instance console log exists: /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.380 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.380 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.381 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:33 np0005592767 nova_compute[182623]: 2026-01-22 22:32:33.948 182627 DEBUG nova.network.neutron [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Successfully created port: 51272942-4bc8-464f-9432-7d95923a10c5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.279 182627 DEBUG nova.network.neutron [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Successfully updated port: 51272942-4bc8-464f-9432-7d95923a10c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.302 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "refresh_cache-4083151b-cb74-4902-b4e9-64b23a3403d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.303 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquired lock "refresh_cache-4083151b-cb74-4902-b4e9-64b23a3403d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.303 182627 DEBUG nova.network.neutron [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.418 182627 DEBUG nova.compute.manager [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-changed-51272942-4bc8-464f-9432-7d95923a10c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.419 182627 DEBUG nova.compute.manager [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Refreshing instance network info cache due to event network-changed-51272942-4bc8-464f-9432-7d95923a10c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.420 182627 DEBUG oslo_concurrency.lockutils [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-4083151b-cb74-4902-b4e9-64b23a3403d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:35 np0005592767 nova_compute[182623]: 2026-01-22 22:32:35.514 182627 DEBUG nova.network.neutron [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.216 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.456 182627 DEBUG nova.network.neutron [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Updating instance_info_cache with network_info: [{"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.486 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Releasing lock "refresh_cache-4083151b-cb74-4902-b4e9-64b23a3403d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.487 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Instance network_info: |[{"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.488 182627 DEBUG oslo_concurrency.lockutils [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-4083151b-cb74-4902-b4e9-64b23a3403d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.488 182627 DEBUG nova.network.neutron [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Refreshing network info cache for port 51272942-4bc8-464f-9432-7d95923a10c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.491 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Start _get_guest_xml network_info=[{"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.497 182627 WARNING nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.503 182627 DEBUG nova.virt.libvirt.host [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.504 182627 DEBUG nova.virt.libvirt.host [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.510 182627 DEBUG nova.virt.libvirt.host [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.511 182627 DEBUG nova.virt.libvirt.host [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.512 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.513 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.513 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.514 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.514 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.514 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.515 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.515 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.515 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.516 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.516 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.516 182627 DEBUG nova.virt.hardware [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.521 182627 DEBUG nova.virt.libvirt.vif [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:32:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-348209217',display_name='tempest-tempest.common.compute-instance-348209217-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-348209217-2',id=94,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01c8405dbfef4380888a9355710f3976',ramdisk_id='',reservation_id='r-936h1j46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1123278408',owner_user_name='tempest-Multiple
CreateTestJSON-1123278408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:32Z,user_data=None,user_id='3738d2d62baa4adc84f010ecf9eda9ec',uuid=4083151b-cb74-4902-b4e9-64b23a3403d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.522 182627 DEBUG nova.network.os_vif_util [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converting VIF {"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.523 182627 DEBUG nova.network.os_vif_util [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.524 182627 DEBUG nova.objects.instance [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4083151b-cb74-4902-b4e9-64b23a3403d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.537 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <uuid>4083151b-cb74-4902-b4e9-64b23a3403d8</uuid>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <name>instance-0000005e</name>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:name>tempest-tempest.common.compute-instance-348209217-2</nova:name>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:32:36</nova:creationTime>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:user uuid="3738d2d62baa4adc84f010ecf9eda9ec">tempest-MultipleCreateTestJSON-1123278408-project-member</nova:user>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:project uuid="01c8405dbfef4380888a9355710f3976">tempest-MultipleCreateTestJSON-1123278408</nova:project>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        <nova:port uuid="51272942-4bc8-464f-9432-7d95923a10c5">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <entry name="serial">4083151b-cb74-4902-b4e9-64b23a3403d8</entry>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <entry name="uuid">4083151b-cb74-4902-b4e9-64b23a3403d8</entry>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.config"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:ee:34:32"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <target dev="tap51272942-4b"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/console.log" append="off"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:32:36 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:32:36 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:32:36 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:32:36 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.538 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Preparing to wait for external event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.539 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.540 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.540 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.542 182627 DEBUG nova.virt.libvirt.vif [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:32:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-348209217',display_name='tempest-tempest.common.compute-instance-348209217-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-348209217-2',id=94,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01c8405dbfef4380888a9355710f3976',ramdisk_id='',reservation_id='r-936h1j46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1123278408',owner_user_name='tempes
t-MultipleCreateTestJSON-1123278408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:32Z,user_data=None,user_id='3738d2d62baa4adc84f010ecf9eda9ec',uuid=4083151b-cb74-4902-b4e9-64b23a3403d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.543 182627 DEBUG nova.network.os_vif_util [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converting VIF {"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.544 182627 DEBUG nova.network.os_vif_util [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.545 182627 DEBUG os_vif [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.546 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.546 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.547 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.553 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.553 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51272942-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.554 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap51272942-4b, col_values=(('external_ids', {'iface-id': '51272942-4bc8-464f-9432-7d95923a10c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:34:32', 'vm-uuid': '4083151b-cb74-4902-b4e9-64b23a3403d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:36 np0005592767 NetworkManager[54973]: <info>  [1769121156.5586] manager: (tap51272942-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.566 182627 INFO os_vif [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b')#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.640 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.641 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.641 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] No VIF found with MAC fa:16:3e:ee:34:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.642 182627 INFO nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Using config drive#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.828 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121141.8236759, 66bd6e4e-3db5-45d3-8495-bb100526e6a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.828 182627 INFO nova.compute.manager [-] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:32:36 np0005592767 nova_compute[182623]: 2026-01-22 22:32:36.850 182627 DEBUG nova.compute.manager [None req-55f829f4-56b0-4304-9e71-9181460a9725 - - - - - -] [instance: 66bd6e4e-3db5-45d3-8495-bb100526e6a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.219 182627 INFO nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Creating config drive at /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.config#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.229 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7p3r88a7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.377 182627 DEBUG oslo_concurrency.processutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7p3r88a7" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:37 np0005592767 kernel: tap51272942-4b: entered promiscuous mode
Jan 22 17:32:37 np0005592767 NetworkManager[54973]: <info>  [1769121157.4638] manager: (tap51272942-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 22 17:32:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:37Z|00348|binding|INFO|Claiming lport 51272942-4bc8-464f-9432-7d95923a10c5 for this chassis.
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.464 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:37Z|00349|binding|INFO|51272942-4bc8-464f-9432-7d95923a10c5: Claiming fa:16:3e:ee:34:32 10.100.0.6
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.473 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.489 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:34:32 10.100.0.6'], port_security=['fa:16:3e:ee:34:32 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4083151b-cb74-4902-b4e9-64b23a3403d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01c8405dbfef4380888a9355710f3976', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed64c03f-6456-45f9-8629-39125ddd339c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ec884df-aeb7-41d1-a543-87420ce258d1, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=51272942-4bc8-464f-9432-7d95923a10c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.491 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 51272942-4bc8-464f-9432-7d95923a10c5 in datapath f8c52f08-8cb6-4b6d-9351-5c47b120e443 bound to our chassis#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.495 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8c52f08-8cb6-4b6d-9351-5c47b120e443#033[00m
Jan 22 17:32:37 np0005592767 systemd-udevd[224249]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.506 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[38a3bba3-2f9e-41d2-9be5-920f518e7b05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.507 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8c52f08-81 in ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.510 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8c52f08-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.510 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[af290eab-3f19-42da-9e12-b9b901dc3a6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.511 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5196968b-ff28-4fc4-b9a0-fa2d5b86ff9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 systemd-machined[153912]: New machine qemu-45-instance-0000005e.
Jan 22 17:32:37 np0005592767 NetworkManager[54973]: <info>  [1769121157.5248] device (tap51272942-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:32:37 np0005592767 NetworkManager[54973]: <info>  [1769121157.5260] device (tap51272942-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.526 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[51267fca-1d4a-49ab-8c55-d7e235fac1d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.535 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.540 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.542 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5b0e14-dffc-4093-9134-88db6c3091cc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 systemd[1]: Started Virtual Machine qemu-45-instance-0000005e.
Jan 22 17:32:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:37Z|00350|binding|INFO|Setting lport 51272942-4bc8-464f-9432-7d95923a10c5 ovn-installed in OVS
Jan 22 17:32:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:37Z|00351|binding|INFO|Setting lport 51272942-4bc8-464f-9432-7d95923a10c5 up in Southbound
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.546 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.587 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7bfddf78-b9db-493a-ba68-77ed103001b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 NetworkManager[54973]: <info>  [1769121157.5970] manager: (tapf8c52f08-80): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.596 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[49f63aab-ffac-4ede-99c4-233242e8b0a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.627 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3c8cf4-032b-45fd-82ee-18e27bccc586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.630 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5a31a90f-ad2a-4280-ad50-d88b6577edb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 NetworkManager[54973]: <info>  [1769121157.6523] device (tapf8c52f08-80): carrier: link connected
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.653 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4b5c40-b692-4f6b-bc2b-4e0ecfb4d0a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.670 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[40ca0589-d45d-4c3a-b47d-f196004961cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8c52f08-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:b8:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475422, 'reachable_time': 20019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224283, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.687 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c61973-3d32-460c-b69c-5111e536a5ad]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:b838'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475422, 'tstamp': 475422}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224284, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.705 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5c523990-adef-4986-9597-ae46b5b29ce1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8c52f08-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:b8:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475422, 'reachable_time': 20019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224285, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.740 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2dbb3ce6-bd2b-473a-912a-81656ca398ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.798 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9cabb2d6-a6cd-4400-96c7-75bd5c0c503a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.799 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8c52f08-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.800 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.801 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8c52f08-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.803 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 NetworkManager[54973]: <info>  [1769121157.8041] manager: (tapf8c52f08-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 22 17:32:37 np0005592767 kernel: tapf8c52f08-80: entered promiscuous mode
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.806 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.808 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8c52f08-80, col_values=(('external_ids', {'iface-id': 'd5406e31-f0da-4e2d-982e-99f7b6960ba7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.809 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:37Z|00352|binding|INFO|Releasing lport d5406e31-f0da-4e2d-982e-99f7b6960ba7 from this chassis (sb_readonly=0)
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.828 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8c52f08-8cb6-4b6d-9351-5c47b120e443.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8c52f08-8cb6-4b6d-9351-5c47b120e443.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.826 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.828 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.829 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f49b1d45-8843-4523-8d85-3aa7e338caa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.831 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-f8c52f08-8cb6-4b6d-9351-5c47b120e443
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/f8c52f08-8cb6-4b6d-9351-5c47b120e443.pid.haproxy
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID f8c52f08-8cb6-4b6d-9351-5c47b120e443
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:32:37 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:37.832 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'env', 'PROCESS_TAG=haproxy-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8c52f08-8cb6-4b6d-9351-5c47b120e443.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.847 182627 DEBUG nova.network.neutron [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Updated VIF entry in instance network info cache for port 51272942-4bc8-464f-9432-7d95923a10c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.848 182627 DEBUG nova.network.neutron [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Updating instance_info_cache with network_info: [{"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:37 np0005592767 nova_compute[182623]: 2026-01-22 22:32:37.865 182627 DEBUG oslo_concurrency.lockutils [req-2ab280bd-87c9-4b46-b4e5-30354a787b7b req-3d1d9097-3f7e-4d57-a412-1b8c5d7f0cb0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-4083151b-cb74-4902-b4e9-64b23a3403d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.132 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121158.1312845, 4083151b-cb74-4902-b4e9-64b23a3403d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.133 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] VM Started (Lifecycle Event)#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.160 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.166 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121158.1318483, 4083151b-cb74-4902-b4e9-64b23a3403d8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.166 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.189 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.193 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:32:38 np0005592767 nova_compute[182623]: 2026-01-22 22:32:38.213 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:32:38 np0005592767 podman[224324]: 2026-01-22 22:32:38.247857421 +0000 UTC m=+0.054412470 container create 68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 17:32:38 np0005592767 systemd[1]: Started libpod-conmon-68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48.scope.
Jan 22 17:32:38 np0005592767 podman[224324]: 2026-01-22 22:32:38.221372832 +0000 UTC m=+0.027927911 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:32:38 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:32:38 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dec99b468d499d7fdcd00572d96759ebaaf1ea882200ebc458eaf690f5222331/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:32:38 np0005592767 podman[224324]: 2026-01-22 22:32:38.345530785 +0000 UTC m=+0.152085844 container init 68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:32:38 np0005592767 podman[224324]: 2026-01-22 22:32:38.350428894 +0000 UTC m=+0.156983943 container start 68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:32:38 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [NOTICE]   (224344) : New worker (224346) forked
Jan 22 17:32:38 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [NOTICE]   (224344) : Loading success.
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.777 182627 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.777 182627 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.778 182627 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.778 182627 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.778 182627 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Processing event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.778 182627 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.779 182627 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.779 182627 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.779 182627 DEBUG oslo_concurrency.lockutils [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.779 182627 DEBUG nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] No waiting events found dispatching network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.779 182627 WARNING nova.compute.manager [req-bcbef3d5-36c2-4087-b4d4-4eea022b5b0b req-716b91fc-1292-4901-a131-46509c1167b8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received unexpected event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.780 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.783 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121159.7836764, 4083151b-cb74-4902-b4e9-64b23a3403d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.784 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.786 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.790 182627 INFO nova.virt.libvirt.driver [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Instance spawned successfully.#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.791 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.834 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.835 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.836 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.836 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.837 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.837 182627 DEBUG nova.virt.libvirt.driver [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.841 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.845 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.894 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.991 182627 INFO nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Took 6.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:32:39 np0005592767 nova_compute[182623]: 2026-01-22 22:32:39.992 182627 DEBUG nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:40 np0005592767 nova_compute[182623]: 2026-01-22 22:32:40.094 182627 INFO nova.compute.manager [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Took 7.61 seconds to build instance.#033[00m
Jan 22 17:32:40 np0005592767 nova_compute[182623]: 2026-01-22 22:32:40.118 182627 DEBUG oslo_concurrency.lockutils [None req-3cc633b6-65df-4029-87c8-0f56a448b72a 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:41 np0005592767 nova_compute[182623]: 2026-01-22 22:32:41.250 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:41 np0005592767 nova_compute[182623]: 2026-01-22 22:32:41.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:43 np0005592767 podman[224355]: 2026-01-22 22:32:43.181576595 +0000 UTC m=+0.087178318 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.652 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.655 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.655 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.656 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.656 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.671 182627 INFO nova.compute.manager [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Terminating instance#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.684 182627 DEBUG nova.compute.manager [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:32:43 np0005592767 kernel: tap51272942-4b (unregistering): left promiscuous mode
Jan 22 17:32:43 np0005592767 NetworkManager[54973]: <info>  [1769121163.7148] device (tap51272942-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:32:43 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:43Z|00353|binding|INFO|Releasing lport 51272942-4bc8-464f-9432-7d95923a10c5 from this chassis (sb_readonly=0)
Jan 22 17:32:43 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:43Z|00354|binding|INFO|Setting lport 51272942-4bc8-464f-9432-7d95923a10c5 down in Southbound
Jan 22 17:32:43 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:43Z|00355|binding|INFO|Removing iface tap51272942-4b ovn-installed in OVS
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.760 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:43.772 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:34:32 10.100.0.6'], port_security=['fa:16:3e:ee:34:32 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4083151b-cb74-4902-b4e9-64b23a3403d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01c8405dbfef4380888a9355710f3976', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed64c03f-6456-45f9-8629-39125ddd339c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ec884df-aeb7-41d1-a543-87420ce258d1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=51272942-4bc8-464f-9432-7d95923a10c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:32:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:43.774 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 51272942-4bc8-464f-9432-7d95923a10c5 in datapath f8c52f08-8cb6-4b6d-9351-5c47b120e443 unbound from our chassis#033[00m
Jan 22 17:32:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:43.777 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8c52f08-8cb6-4b6d-9351-5c47b120e443, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:32:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:43.779 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3b9c935f-3b0e-4cdb-8805-372a4c85eb25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:43.781 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 namespace which is not needed anymore#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.782 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:43 np0005592767 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 22 17:32:43 np0005592767 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005e.scope: Consumed 4.557s CPU time.
Jan 22 17:32:43 np0005592767 systemd-machined[153912]: Machine qemu-45-instance-0000005e terminated.
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.954 182627 INFO nova.virt.libvirt.driver [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Instance destroyed successfully.#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.955 182627 DEBUG nova.objects.instance [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lazy-loading 'resources' on Instance uuid 4083151b-cb74-4902-b4e9-64b23a3403d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:43 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [NOTICE]   (224344) : haproxy version is 2.8.14-c23fe91
Jan 22 17:32:43 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [NOTICE]   (224344) : path to executable is /usr/sbin/haproxy
Jan 22 17:32:43 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [WARNING]  (224344) : Exiting Master process...
Jan 22 17:32:43 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [ALERT]    (224344) : Current worker (224346) exited with code 143 (Terminated)
Jan 22 17:32:43 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224340]: [WARNING]  (224344) : All workers exited. Exiting... (0)
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.968 182627 DEBUG nova.virt.libvirt.vif [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:32:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-348209217',display_name='tempest-tempest.common.compute-instance-348209217-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-348209217-2',id=94,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-22T22:32:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='01c8405dbfef4380888a9355710f3976',ramdisk_id='',reservation_id='r-936h1j46',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1123278408',owner_user_name='tempest-MultipleCreateTestJSON-1123278408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:40Z,user_data=None,user_id='3738d2d62baa4adc84f010ecf9eda9ec',uuid=4083151b-cb74-4902-b4e9-64b23a3403d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.968 182627 DEBUG nova.network.os_vif_util [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converting VIF {"id": "51272942-4bc8-464f-9432-7d95923a10c5", "address": "fa:16:3e:ee:34:32", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51272942-4b", "ovs_interfaceid": "51272942-4bc8-464f-9432-7d95923a10c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.969 182627 DEBUG nova.network.os_vif_util [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.969 182627 DEBUG os_vif [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:32:43 np0005592767 systemd[1]: libpod-68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48.scope: Deactivated successfully.
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.971 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.972 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51272942-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.973 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.975 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:43 np0005592767 podman[224398]: 2026-01-22 22:32:43.977821474 +0000 UTC m=+0.090663367 container died 68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.977 182627 INFO os_vif [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:34:32,bridge_name='br-int',has_traffic_filtering=True,id=51272942-4bc8-464f-9432-7d95923a10c5,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51272942-4b')#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.979 182627 INFO nova.virt.libvirt.driver [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Deleting instance files /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8_del#033[00m
Jan 22 17:32:43 np0005592767 nova_compute[182623]: 2026-01-22 22:32:43.979 182627 INFO nova.virt.libvirt.driver [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Deletion of /var/lib/nova/instances/4083151b-cb74-4902-b4e9-64b23a3403d8_del complete#033[00m
Jan 22 17:32:44 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48-userdata-shm.mount: Deactivated successfully.
Jan 22 17:32:44 np0005592767 systemd[1]: var-lib-containers-storage-overlay-dec99b468d499d7fdcd00572d96759ebaaf1ea882200ebc458eaf690f5222331-merged.mount: Deactivated successfully.
Jan 22 17:32:44 np0005592767 podman[224398]: 2026-01-22 22:32:44.016529439 +0000 UTC m=+0.129371342 container cleanup 68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:32:44 np0005592767 systemd[1]: libpod-conmon-68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48.scope: Deactivated successfully.
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.097 182627 INFO nova.compute.manager [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.098 182627 DEBUG oslo.service.loopingcall [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.099 182627 DEBUG nova.compute.manager [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.100 182627 DEBUG nova.network.neutron [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:32:44 np0005592767 podman[224442]: 2026-01-22 22:32:44.104871819 +0000 UTC m=+0.055509512 container remove 68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.111 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f54aeca0-6127-4519-9fb4-13acdebe2faf]: (4, ('Thu Jan 22 10:32:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 (68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48)\n68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48\nThu Jan 22 10:32:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 (68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48)\n68c76a59871953d0a42506dd8837c967baffa7a01421904df4b53808c85f0a48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.113 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[566cd551-2bc4-4f93-a191-0ce29d18b7cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.114 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8c52f08-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:44 np0005592767 kernel: tapf8c52f08-80: left promiscuous mode
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.133 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.138 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3ffba86c-89d7-427a-8580-eb524b1450e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.154 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e447cf-2253-41f8-817e-6f6c22ba3a92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.157 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[84c3cff4-2028-47a0-a646-e0e8585ef927]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.182 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[685eaeda-1c8d-434a-8532-5bd29044d34c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475416, 'reachable_time': 31171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224457, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.188 182627 DEBUG nova.compute.manager [req-f4ddf0f6-574c-4c49-9db4-d5aea234c2af req-04602951-d60f-4cab-bd24-b596c650000a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-vif-unplugged-51272942-4bc8-464f-9432-7d95923a10c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.189 182627 DEBUG oslo_concurrency.lockutils [req-f4ddf0f6-574c-4c49-9db4-d5aea234c2af req-04602951-d60f-4cab-bd24-b596c650000a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.189 182627 DEBUG oslo_concurrency.lockutils [req-f4ddf0f6-574c-4c49-9db4-d5aea234c2af req-04602951-d60f-4cab-bd24-b596c650000a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.189 182627 DEBUG oslo_concurrency.lockutils [req-f4ddf0f6-574c-4c49-9db4-d5aea234c2af req-04602951-d60f-4cab-bd24-b596c650000a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.190 182627 DEBUG nova.compute.manager [req-f4ddf0f6-574c-4c49-9db4-d5aea234c2af req-04602951-d60f-4cab-bd24-b596c650000a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] No waiting events found dispatching network-vif-unplugged-51272942-4bc8-464f-9432-7d95923a10c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:44 np0005592767 nova_compute[182623]: 2026-01-22 22:32:44.190 182627 DEBUG nova.compute.manager [req-f4ddf0f6-574c-4c49-9db4-d5aea234c2af req-04602951-d60f-4cab-bd24-b596c650000a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-vif-unplugged-51272942-4bc8-464f-9432-7d95923a10c5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:32:44 np0005592767 systemd[1]: run-netns-ovnmeta\x2df8c52f08\x2d8cb6\x2d4b6d\x2d9351\x2d5c47b120e443.mount: Deactivated successfully.
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.189 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:32:44 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:44.189 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[cb82437c-ac6e-40c9-a41f-0a2e9581579e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.251 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.884 182627 DEBUG nova.compute.manager [req-70b916c1-0ca7-4727-b049-13bcf619b24b req-c26537ee-a4af-4bdd-8fbd-2c3d96186c52 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.885 182627 DEBUG oslo_concurrency.lockutils [req-70b916c1-0ca7-4727-b049-13bcf619b24b req-c26537ee-a4af-4bdd-8fbd-2c3d96186c52 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.885 182627 DEBUG oslo_concurrency.lockutils [req-70b916c1-0ca7-4727-b049-13bcf619b24b req-c26537ee-a4af-4bdd-8fbd-2c3d96186c52 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.886 182627 DEBUG oslo_concurrency.lockutils [req-70b916c1-0ca7-4727-b049-13bcf619b24b req-c26537ee-a4af-4bdd-8fbd-2c3d96186c52 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.886 182627 DEBUG nova.compute.manager [req-70b916c1-0ca7-4727-b049-13bcf619b24b req-c26537ee-a4af-4bdd-8fbd-2c3d96186c52 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] No waiting events found dispatching network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:32:46 np0005592767 nova_compute[182623]: 2026-01-22 22:32:46.887 182627 WARNING nova.compute.manager [req-70b916c1-0ca7-4727-b049-13bcf619b24b req-c26537ee-a4af-4bdd-8fbd-2c3d96186c52 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received unexpected event network-vif-plugged-51272942-4bc8-464f-9432-7d95923a10c5 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.664 182627 DEBUG nova.network.neutron [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.688 182627 INFO nova.compute.manager [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Took 3.59 seconds to deallocate network for instance.#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.798 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.799 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.882 182627 DEBUG nova.compute.provider_tree [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.909 182627 DEBUG nova.scheduler.client.report [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:32:47 np0005592767 nova_compute[182623]: 2026-01-22 22:32:47.943 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:48 np0005592767 nova_compute[182623]: 2026-01-22 22:32:48.007 182627 INFO nova.scheduler.client.report [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Deleted allocations for instance 4083151b-cb74-4902-b4e9-64b23a3403d8#033[00m
Jan 22 17:32:48 np0005592767 nova_compute[182623]: 2026-01-22 22:32:48.110 182627 DEBUG oslo_concurrency.lockutils [None req-a818132e-2ad0-484e-ad25-a616c6a2fe8c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "4083151b-cb74-4902-b4e9-64b23a3403d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:48 np0005592767 nova_compute[182623]: 2026-01-22 22:32:48.973 182627 DEBUG nova.compute.manager [req-9f0caa80-c5e1-47b5-9837-d709937ba655 req-ef6c9c2a-1265-41f9-8250-d376a121ef69 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Received event network-vif-deleted-51272942-4bc8-464f-9432-7d95923a10c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:48 np0005592767 nova_compute[182623]: 2026-01-22 22:32:48.976 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:50 np0005592767 podman[224458]: 2026-01-22 22:32:50.181802958 +0000 UTC m=+0.102246934 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:32:50 np0005592767 podman[224459]: 2026-01-22 22:32:50.181637093 +0000 UTC m=+0.096757928 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Jan 22 17:32:51 np0005592767 nova_compute[182623]: 2026-01-22 22:32:51.296 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.489 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.489 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.516 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.658 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.659 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.665 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.665 182627 INFO nova.compute.claims [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.792 182627 DEBUG nova.compute.provider_tree [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.817 182627 DEBUG nova.scheduler.client.report [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.850 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.852 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.923 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.924 182627 DEBUG nova.network.neutron [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.950 182627 INFO nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:32:52 np0005592767 nova_compute[182623]: 2026-01-22 22:32:52.975 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.106 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.108 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.109 182627 INFO nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Creating image(s)#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.110 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "/var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.111 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "/var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.112 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "/var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.141 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.215 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.216 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.217 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.229 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.285 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.286 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.343 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.344 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.345 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.405 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.406 182627 DEBUG nova.virt.disk.api [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Checking if we can resize image /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.407 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.495 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.497 182627 DEBUG nova.virt.disk.api [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Cannot resize image /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.498 182627 DEBUG nova.objects.instance [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lazy-loading 'migration_context' on Instance uuid bf6e2992-6fe6-4c4c-ad16-a5342029c966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.516 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.517 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Ensure instance console log exists: /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.517 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.518 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.518 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.891 182627 DEBUG nova.policy [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3738d2d62baa4adc84f010ecf9eda9ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '01c8405dbfef4380888a9355710f3976', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:32:53 np0005592767 nova_compute[182623]: 2026-01-22 22:32:53.979 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:54 np0005592767 podman[224521]: 2026-01-22 22:32:54.150922995 +0000 UTC m=+0.061912242 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 17:32:54 np0005592767 podman[224522]: 2026-01-22 22:32:54.203520254 +0000 UTC m=+0.099498037 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:32:54 np0005592767 nova_compute[182623]: 2026-01-22 22:32:54.541 182627 DEBUG nova.network.neutron [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Successfully created port: 6e95257d-fd9a-4ffd-a45b-9081471843c8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.530 182627 DEBUG nova.network.neutron [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Successfully updated port: 6e95257d-fd9a-4ffd-a45b-9081471843c8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.562 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "refresh_cache-bf6e2992-6fe6-4c4c-ad16-a5342029c966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.563 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquired lock "refresh_cache-bf6e2992-6fe6-4c4c-ad16-a5342029c966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.563 182627 DEBUG nova.network.neutron [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.659 182627 DEBUG nova.compute.manager [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-changed-6e95257d-fd9a-4ffd-a45b-9081471843c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.660 182627 DEBUG nova.compute.manager [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Refreshing instance network info cache due to event network-changed-6e95257d-fd9a-4ffd-a45b-9081471843c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.660 182627 DEBUG oslo_concurrency.lockutils [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bf6e2992-6fe6-4c4c-ad16-a5342029c966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:32:55 np0005592767 nova_compute[182623]: 2026-01-22 22:32:55.740 182627 DEBUG nova.network.neutron [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.299 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.856 182627 DEBUG nova.network.neutron [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Updating instance_info_cache with network_info: [{"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.879 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Releasing lock "refresh_cache-bf6e2992-6fe6-4c4c-ad16-a5342029c966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.879 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Instance network_info: |[{"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.879 182627 DEBUG oslo_concurrency.lockutils [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bf6e2992-6fe6-4c4c-ad16-a5342029c966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.880 182627 DEBUG nova.network.neutron [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Refreshing network info cache for port 6e95257d-fd9a-4ffd-a45b-9081471843c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.882 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Start _get_guest_xml network_info=[{"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.887 182627 WARNING nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.894 182627 DEBUG nova.virt.libvirt.host [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.895 182627 DEBUG nova.virt.libvirt.host [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.899 182627 DEBUG nova.virt.libvirt.host [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.899 182627 DEBUG nova.virt.libvirt.host [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.900 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.901 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.901 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.901 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.901 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.902 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.902 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.902 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.902 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.902 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.903 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.903 182627 DEBUG nova.virt.hardware [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.906 182627 DEBUG nova.virt.libvirt.vif [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1309066083',display_name='tempest-MultipleCreateTestJSON-server-1309066083-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1309066083-1',id=96,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01c8405dbfef4380888a9355710f3976',ramdisk_id='',reservation_id='r-0ibae45s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1123278408',owner_user_name='tempest-MultipleCre
ateTestJSON-1123278408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:53Z,user_data=None,user_id='3738d2d62baa4adc84f010ecf9eda9ec',uuid=bf6e2992-6fe6-4c4c-ad16-a5342029c966,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.906 182627 DEBUG nova.network.os_vif_util [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converting VIF {"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.907 182627 DEBUG nova.network.os_vif_util [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.908 182627 DEBUG nova.objects.instance [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lazy-loading 'pci_devices' on Instance uuid bf6e2992-6fe6-4c4c-ad16-a5342029c966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.938 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <uuid>bf6e2992-6fe6-4c4c-ad16-a5342029c966</uuid>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <name>instance-00000060</name>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:name>tempest-MultipleCreateTestJSON-server-1309066083-1</nova:name>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:32:56</nova:creationTime>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:user uuid="3738d2d62baa4adc84f010ecf9eda9ec">tempest-MultipleCreateTestJSON-1123278408-project-member</nova:user>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:project uuid="01c8405dbfef4380888a9355710f3976">tempest-MultipleCreateTestJSON-1123278408</nova:project>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        <nova:port uuid="6e95257d-fd9a-4ffd-a45b-9081471843c8">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <entry name="serial">bf6e2992-6fe6-4c4c-ad16-a5342029c966</entry>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <entry name="uuid">bf6e2992-6fe6-4c4c-ad16-a5342029c966</entry>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.config"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b6:6c:0d"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <target dev="tap6e95257d-fd"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/console.log" append="off"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:32:56 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:32:56 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:32:56 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:32:56 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.939 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Preparing to wait for external event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.940 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.940 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.940 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.941 182627 DEBUG nova.virt.libvirt.vif [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1309066083',display_name='tempest-MultipleCreateTestJSON-server-1309066083-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1309066083-1',id=96,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01c8405dbfef4380888a9355710f3976',ramdisk_id='',reservation_id='r-0ibae45s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1123278408',owner_user_name='tempest-M
ultipleCreateTestJSON-1123278408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:32:53Z,user_data=None,user_id='3738d2d62baa4adc84f010ecf9eda9ec',uuid=bf6e2992-6fe6-4c4c-ad16-a5342029c966,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.941 182627 DEBUG nova.network.os_vif_util [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converting VIF {"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.942 182627 DEBUG nova.network.os_vif_util [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.942 182627 DEBUG os_vif [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.942 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.943 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.943 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.945 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.945 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e95257d-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.946 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e95257d-fd, col_values=(('external_ids', {'iface-id': '6e95257d-fd9a-4ffd-a45b-9081471843c8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:6c:0d', 'vm-uuid': 'bf6e2992-6fe6-4c4c-ad16-a5342029c966'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.947 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:56 np0005592767 NetworkManager[54973]: <info>  [1769121176.9488] manager: (tap6e95257d-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.949 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.956 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:56 np0005592767 nova_compute[182623]: 2026-01-22 22:32:56.957 182627 INFO os_vif [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd')#033[00m
Jan 22 17:32:57 np0005592767 nova_compute[182623]: 2026-01-22 22:32:57.031 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:32:57 np0005592767 nova_compute[182623]: 2026-01-22 22:32:57.031 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:32:57 np0005592767 nova_compute[182623]: 2026-01-22 22:32:57.032 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] No VIF found with MAC fa:16:3e:b6:6c:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:32:57 np0005592767 nova_compute[182623]: 2026-01-22 22:32:57.032 182627 INFO nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Using config drive#033[00m
Jan 22 17:32:57 np0005592767 nova_compute[182623]: 2026-01-22 22:32:57.947 182627 INFO nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Creating config drive at /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.config#033[00m
Jan 22 17:32:57 np0005592767 nova_compute[182623]: 2026-01-22 22:32:57.952 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxquik7za execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.082 182627 DEBUG oslo_concurrency.processutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxquik7za" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:32:58 np0005592767 kernel: tap6e95257d-fd: entered promiscuous mode
Jan 22 17:32:58 np0005592767 NetworkManager[54973]: <info>  [1769121178.1557] manager: (tap6e95257d-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:58Z|00356|binding|INFO|Claiming lport 6e95257d-fd9a-4ffd-a45b-9081471843c8 for this chassis.
Jan 22 17:32:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:58Z|00357|binding|INFO|6e95257d-fd9a-4ffd-a45b-9081471843c8: Claiming fa:16:3e:b6:6c:0d 10.100.0.4
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.172 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:6c:0d 10.100.0.4'], port_security=['fa:16:3e:b6:6c:0d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bf6e2992-6fe6-4c4c-ad16-a5342029c966', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01c8405dbfef4380888a9355710f3976', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed64c03f-6456-45f9-8629-39125ddd339c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ec884df-aeb7-41d1-a543-87420ce258d1, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=6e95257d-fd9a-4ffd-a45b-9081471843c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.173 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 6e95257d-fd9a-4ffd-a45b-9081471843c8 in datapath f8c52f08-8cb6-4b6d-9351-5c47b120e443 bound to our chassis#033[00m
Jan 22 17:32:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:58Z|00358|binding|INFO|Setting lport 6e95257d-fd9a-4ffd-a45b-9081471843c8 ovn-installed in OVS
Jan 22 17:32:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:58Z|00359|binding|INFO|Setting lport 6e95257d-fd9a-4ffd-a45b-9081471843c8 up in Southbound
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.175 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8c52f08-8cb6-4b6d-9351-5c47b120e443#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.176 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 systemd-udevd[224583]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.189 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[658fe1f8-b6b0-4189-aa97-50f75a000a9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.190 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8c52f08-81 in ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.192 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8c52f08-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.193 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c083c3b9-aeb0-4b67-91d3-5fbb62d6e2ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.194 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8805c1ce-d591-45ba-9ea7-705a9a20d1c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 systemd-machined[153912]: New machine qemu-46-instance-00000060.
Jan 22 17:32:58 np0005592767 NetworkManager[54973]: <info>  [1769121178.2074] device (tap6e95257d-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:32:58 np0005592767 NetworkManager[54973]: <info>  [1769121178.2084] device (tap6e95257d-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.208 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a3360b8b-2947-4fb5-9bb9-308289a42ead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 systemd[1]: Started Virtual Machine qemu-46-instance-00000060.
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.230 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f758753e-62ae-44ed-bd44-41feb5111ffb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.258 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f73973-a883-46fc-a879-a79ef0fec75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 NetworkManager[54973]: <info>  [1769121178.2647] manager: (tapf8c52f08-80): new Veth device (/org/freedesktop/NetworkManager/Devices/171)
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.265 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a32ac902-626b-46aa-a116-9ba709b329c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.301 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0a4bdd-8c9d-4ade-8f80-3bab4fb97720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.305 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b83fc448-518a-4fd3-b36e-ac653fb0680d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 NetworkManager[54973]: <info>  [1769121178.3317] device (tapf8c52f08-80): carrier: link connected
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.335 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f431dd6e-9596-487d-a123-e2250548a0ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.349 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1312eff6-2af7-41d3-b3ec-5666dc8a4ef3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8c52f08-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:b8:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477491, 'reachable_time': 38534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224617, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.366 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7deb0319-ce05-4ea2-9e2a-ccfb06d19461]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:b838'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477491, 'tstamp': 477491}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224618, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.388 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[44a0f218-1795-4163-afde-2f8957824c53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8c52f08-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:b8:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477491, 'reachable_time': 38534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224619, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.422 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a96919e4-e8a4-4a2e-8d1b-390058986126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.463 182627 DEBUG nova.compute.manager [req-b36d5d5d-8492-4f24-856e-e5c1b97b5563 req-34cb7fcb-922a-4af3-bce1-c15add6f1806 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.464 182627 DEBUG oslo_concurrency.lockutils [req-b36d5d5d-8492-4f24-856e-e5c1b97b5563 req-34cb7fcb-922a-4af3-bce1-c15add6f1806 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.464 182627 DEBUG oslo_concurrency.lockutils [req-b36d5d5d-8492-4f24-856e-e5c1b97b5563 req-34cb7fcb-922a-4af3-bce1-c15add6f1806 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.465 182627 DEBUG oslo_concurrency.lockutils [req-b36d5d5d-8492-4f24-856e-e5c1b97b5563 req-34cb7fcb-922a-4af3-bce1-c15add6f1806 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.465 182627 DEBUG nova.compute.manager [req-b36d5d5d-8492-4f24-856e-e5c1b97b5563 req-34cb7fcb-922a-4af3-bce1-c15add6f1806 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Processing event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.505 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1490d364-d944-4229-822d-d47aa0681425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.507 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8c52f08-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.507 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.508 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8c52f08-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.511 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 NetworkManager[54973]: <info>  [1769121178.5125] manager: (tapf8c52f08-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 22 17:32:58 np0005592767 kernel: tapf8c52f08-80: entered promiscuous mode
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.515 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.519 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8c52f08-80, col_values=(('external_ids', {'iface-id': 'd5406e31-f0da-4e2d-982e-99f7b6960ba7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.521 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:32:58Z|00360|binding|INFO|Releasing lport d5406e31-f0da-4e2d-982e-99f7b6960ba7 from this chassis (sb_readonly=0)
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.539 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.540 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8c52f08-8cb6-4b6d-9351-5c47b120e443.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8c52f08-8cb6-4b6d-9351-5c47b120e443.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.542 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c2bf860-98c9-4797-a082-b98005c866a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.543 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-f8c52f08-8cb6-4b6d-9351-5c47b120e443
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/f8c52f08-8cb6-4b6d-9351-5c47b120e443.pid.haproxy
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID f8c52f08-8cb6-4b6d-9351-5c47b120e443
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.544 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'env', 'PROCESS_TAG=haproxy-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8c52f08-8cb6-4b6d-9351-5c47b120e443.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.610 182627 DEBUG nova.network.neutron [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Updated VIF entry in instance network info cache for port 6e95257d-fd9a-4ffd-a45b-9081471843c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.610 182627 DEBUG nova.network.neutron [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Updating instance_info_cache with network_info: [{"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.627 182627 DEBUG oslo_concurrency.lockutils [req-51394559-d1ce-4066-9802-bcd837e82f8c req-b67e7689-20a6-4fdf-a453-f9f435cac866 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bf6e2992-6fe6-4c4c-ad16-a5342029c966" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.738 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121178.7378135, bf6e2992-6fe6-4c4c-ad16-a5342029c966 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.739 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] VM Started (Lifecycle Event)#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.742 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.747 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.751 182627 INFO nova.virt.libvirt.driver [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Instance spawned successfully.#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.752 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.760 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.764 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.780 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.780 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.781 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.782 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.782 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.783 182627 DEBUG nova.virt.libvirt.driver [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.788 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.788 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121178.7391524, bf6e2992-6fe6-4c4c-ad16-a5342029c966 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.788 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.834 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.838 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121178.7470286, bf6e2992-6fe6-4c4c-ad16-a5342029c966 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.839 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.857 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.862 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.879 182627 INFO nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Took 5.77 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.880 182627 DEBUG nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.883 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:32:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:58.900 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.901 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:32:58 np0005592767 podman[224656]: 2026-01-22 22:32:58.919868995 +0000 UTC m=+0.054961346 container create e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.953 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121163.9518578, 4083151b-cb74-4902-b4e9-64b23a3403d8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.954 182627 INFO nova.compute.manager [-] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:32:58 np0005592767 systemd[1]: Started libpod-conmon-e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255.scope.
Jan 22 17:32:58 np0005592767 podman[224656]: 2026-01-22 22:32:58.887256713 +0000 UTC m=+0.022349084 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:32:58 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:32:58 np0005592767 nova_compute[182623]: 2026-01-22 22:32:58.990 182627 DEBUG nova.compute.manager [None req-79c8e7a1-e8de-4775-ac24-8127435a07a8 - - - - - -] [instance: 4083151b-cb74-4902-b4e9-64b23a3403d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:32:58 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60e7ea9dc71d98b35a35778cac038ac8002c46dd55eccffd19dbf094fb356b2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:32:59 np0005592767 podman[224656]: 2026-01-22 22:32:59.004710626 +0000 UTC m=+0.139803007 container init e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:32:59 np0005592767 nova_compute[182623]: 2026-01-22 22:32:59.008 182627 INFO nova.compute.manager [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Took 6.42 seconds to build instance.#033[00m
Jan 22 17:32:59 np0005592767 podman[224656]: 2026-01-22 22:32:59.010531611 +0000 UTC m=+0.145623962 container start e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:32:59 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [NOTICE]   (224675) : New worker (224677) forked
Jan 22 17:32:59 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [NOTICE]   (224675) : Loading success.
Jan 22 17:32:59 np0005592767 nova_compute[182623]: 2026-01-22 22:32:59.048 182627 DEBUG oslo_concurrency.lockutils [None req-bb251236-f803-4318-aea4-1471d861920b 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:32:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:32:59.074 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:33:00 np0005592767 nova_compute[182623]: 2026-01-22 22:33:00.759 182627 DEBUG nova.compute.manager [req-8dda4cdd-529d-4389-bec5-a7822f076476 req-d139be76-c006-4de1-ac11-50173eccfc62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:00 np0005592767 nova_compute[182623]: 2026-01-22 22:33:00.759 182627 DEBUG oslo_concurrency.lockutils [req-8dda4cdd-529d-4389-bec5-a7822f076476 req-d139be76-c006-4de1-ac11-50173eccfc62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:00 np0005592767 nova_compute[182623]: 2026-01-22 22:33:00.760 182627 DEBUG oslo_concurrency.lockutils [req-8dda4cdd-529d-4389-bec5-a7822f076476 req-d139be76-c006-4de1-ac11-50173eccfc62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:00 np0005592767 nova_compute[182623]: 2026-01-22 22:33:00.760 182627 DEBUG oslo_concurrency.lockutils [req-8dda4cdd-529d-4389-bec5-a7822f076476 req-d139be76-c006-4de1-ac11-50173eccfc62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:00 np0005592767 nova_compute[182623]: 2026-01-22 22:33:00.760 182627 DEBUG nova.compute.manager [req-8dda4cdd-529d-4389-bec5-a7822f076476 req-d139be76-c006-4de1-ac11-50173eccfc62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] No waiting events found dispatching network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:00 np0005592767 nova_compute[182623]: 2026-01-22 22:33:00.761 182627 WARNING nova.compute.manager [req-8dda4cdd-529d-4389-bec5-a7822f076476 req-d139be76-c006-4de1-ac11-50173eccfc62 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received unexpected event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:33:01 np0005592767 nova_compute[182623]: 2026-01-22 22:33:01.301 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:01 np0005592767 nova_compute[182623]: 2026-01-22 22:33:01.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:02 np0005592767 podman[224686]: 2026-01-22 22:33:02.15660928 +0000 UTC m=+0.072586595 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.307 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.862 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.863 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.863 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.863 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.863 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.874 182627 INFO nova.compute.manager [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Terminating instance#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.885 182627 DEBUG nova.compute.manager [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:33:06 np0005592767 kernel: tap6e95257d-fd (unregistering): left promiscuous mode
Jan 22 17:33:06 np0005592767 NetworkManager[54973]: <info>  [1769121186.9093] device (tap6e95257d-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:33:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:06Z|00361|binding|INFO|Releasing lport 6e95257d-fd9a-4ffd-a45b-9081471843c8 from this chassis (sb_readonly=0)
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.956 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:06Z|00362|binding|INFO|Setting lport 6e95257d-fd9a-4ffd-a45b-9081471843c8 down in Southbound
Jan 22 17:33:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:06Z|00363|binding|INFO|Removing iface tap6e95257d-fd ovn-installed in OVS
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.959 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.959 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:06 np0005592767 nova_compute[182623]: 2026-01-22 22:33:06.971 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:06.971 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:6c:0d 10.100.0.4'], port_security=['fa:16:3e:b6:6c:0d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bf6e2992-6fe6-4c4c-ad16-a5342029c966', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01c8405dbfef4380888a9355710f3976', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed64c03f-6456-45f9-8629-39125ddd339c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ec884df-aeb7-41d1-a543-87420ce258d1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=6e95257d-fd9a-4ffd-a45b-9081471843c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:06.974 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 6e95257d-fd9a-4ffd-a45b-9081471843c8 in datapath f8c52f08-8cb6-4b6d-9351-5c47b120e443 unbound from our chassis#033[00m
Jan 22 17:33:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:06.976 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8c52f08-8cb6-4b6d-9351-5c47b120e443, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:33:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:06.978 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e9dbf152-bfd3-4966-9dd0-68da767ec95f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:06.979 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 namespace which is not needed anymore#033[00m
Jan 22 17:33:06 np0005592767 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000060.scope: Deactivated successfully.
Jan 22 17:33:06 np0005592767 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000060.scope: Consumed 8.811s CPU time.
Jan 22 17:33:06 np0005592767 systemd-machined[153912]: Machine qemu-46-instance-00000060 terminated.
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.076 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:07 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [NOTICE]   (224675) : haproxy version is 2.8.14-c23fe91
Jan 22 17:33:07 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [NOTICE]   (224675) : path to executable is /usr/sbin/haproxy
Jan 22 17:33:07 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [WARNING]  (224675) : Exiting Master process...
Jan 22 17:33:07 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [ALERT]    (224675) : Current worker (224677) exited with code 143 (Terminated)
Jan 22 17:33:07 np0005592767 neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443[224671]: [WARNING]  (224675) : All workers exited. Exiting... (0)
Jan 22 17:33:07 np0005592767 systemd[1]: libpod-e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255.scope: Deactivated successfully.
Jan 22 17:33:07 np0005592767 podman[224734]: 2026-01-22 22:33:07.161961459 +0000 UTC m=+0.062481409 container died e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.162 182627 INFO nova.virt.libvirt.driver [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Instance destroyed successfully.#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.163 182627 DEBUG nova.objects.instance [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lazy-loading 'resources' on Instance uuid bf6e2992-6fe6-4c4c-ad16-a5342029c966 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.180 182627 DEBUG nova.virt.libvirt.vif [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1309066083',display_name='tempest-MultipleCreateTestJSON-server-1309066083-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1309066083-1',id=96,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:32:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='01c8405dbfef4380888a9355710f3976',ramdisk_id='',reservation_id='r-0ibae45s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1123278408',owner_user_name='tempest-MultipleCreateTestJSON-1123278408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:32:58Z,user_data=None,user_id='3738d2d62baa4adc84f010ecf9eda9ec',uuid=bf6e2992-6fe6-4c4c-ad16-a5342029c966,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.180 182627 DEBUG nova.network.os_vif_util [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converting VIF {"id": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "address": "fa:16:3e:b6:6c:0d", "network": {"id": "f8c52f08-8cb6-4b6d-9351-5c47b120e443", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-328875417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "01c8405dbfef4380888a9355710f3976", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e95257d-fd", "ovs_interfaceid": "6e95257d-fd9a-4ffd-a45b-9081471843c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.181 182627 DEBUG nova.network.os_vif_util [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.182 182627 DEBUG os_vif [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.184 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.185 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e95257d-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.186 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.188 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:33:07 np0005592767 systemd[1]: var-lib-containers-storage-overlay-60e7ea9dc71d98b35a35778cac038ac8002c46dd55eccffd19dbf094fb356b2a-merged.mount: Deactivated successfully.
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.191 182627 INFO os_vif [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:6c:0d,bridge_name='br-int',has_traffic_filtering=True,id=6e95257d-fd9a-4ffd-a45b-9081471843c8,network=Network(f8c52f08-8cb6-4b6d-9351-5c47b120e443),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e95257d-fd')#033[00m
Jan 22 17:33:07 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255-userdata-shm.mount: Deactivated successfully.
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.192 182627 INFO nova.virt.libvirt.driver [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Deleting instance files /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966_del#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.192 182627 INFO nova.virt.libvirt.driver [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Deletion of /var/lib/nova/instances/bf6e2992-6fe6-4c4c-ad16-a5342029c966_del complete#033[00m
Jan 22 17:33:07 np0005592767 podman[224734]: 2026-01-22 22:33:07.198158843 +0000 UTC m=+0.098678793 container cleanup e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:33:07 np0005592767 systemd[1]: libpod-conmon-e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255.scope: Deactivated successfully.
Jan 22 17:33:07 np0005592767 podman[224774]: 2026-01-22 22:33:07.262093602 +0000 UTC m=+0.044053887 container remove e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.266 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb03713-c675-4f15-8ca2-f19bd8d5a822]: (4, ('Thu Jan 22 10:33:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 (e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255)\ne69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255\nThu Jan 22 10:33:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 (e69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255)\ne69da5e026d7c7c03526eaa595a79d2591b491519b03842d633dcf1217f70255\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.268 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc6e754-9ff2-4c7c-a2ed-b27b7178dd73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.269 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8c52f08-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:07 np0005592767 kernel: tapf8c52f08-80: left promiscuous mode
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.274 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.279 182627 INFO nova.compute.manager [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.280 182627 DEBUG oslo.service.loopingcall [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.280 182627 DEBUG nova.compute.manager [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.280 182627 DEBUG nova.network.neutron [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.283 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.286 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d2a841-2a56-4827-ab1e-27da61eddb81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.301 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[68dcd5de-973f-4f94-814d-4e82a5dabf65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.303 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6aa0ed-2348-4c02-9a7a-829dc130fad8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.317 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc3c4df-81a7-4f19-8896-fbad7d931e6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477483, 'reachable_time': 25142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224793, 'error': None, 'target': 'ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.319 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8c52f08-8cb6-4b6d-9351-5c47b120e443 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:33:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:07.319 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ad88e9-2b78-4c41-9b13-b0378c0a7b9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:07 np0005592767 systemd[1]: run-netns-ovnmeta\x2df8c52f08\x2d8cb6\x2d4b6d\x2d9351\x2d5c47b120e443.mount: Deactivated successfully.
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.867 182627 DEBUG nova.compute.manager [req-d82a540c-c786-47f4-8687-a1ba8507f426 req-0662a18d-badb-4757-bbec-569918d0340d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-vif-unplugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.868 182627 DEBUG oslo_concurrency.lockutils [req-d82a540c-c786-47f4-8687-a1ba8507f426 req-0662a18d-badb-4757-bbec-569918d0340d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.868 182627 DEBUG oslo_concurrency.lockutils [req-d82a540c-c786-47f4-8687-a1ba8507f426 req-0662a18d-badb-4757-bbec-569918d0340d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.869 182627 DEBUG oslo_concurrency.lockutils [req-d82a540c-c786-47f4-8687-a1ba8507f426 req-0662a18d-badb-4757-bbec-569918d0340d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.869 182627 DEBUG nova.compute.manager [req-d82a540c-c786-47f4-8687-a1ba8507f426 req-0662a18d-badb-4757-bbec-569918d0340d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] No waiting events found dispatching network-vif-unplugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:07 np0005592767 nova_compute[182623]: 2026-01-22 22:33:07.869 182627 DEBUG nova.compute.manager [req-d82a540c-c786-47f4-8687-a1ba8507f426 req-0662a18d-badb-4757-bbec-569918d0340d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-vif-unplugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.235 182627 DEBUG nova.network.neutron [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.252 182627 INFO nova.compute.manager [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Took 0.97 seconds to deallocate network for instance.#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.327 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.328 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.402 182627 DEBUG nova.compute.provider_tree [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.424 182627 DEBUG nova.scheduler.client.report [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.462 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.487 182627 INFO nova.scheduler.client.report [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Deleted allocations for instance bf6e2992-6fe6-4c4c-ad16-a5342029c966#033[00m
Jan 22 17:33:08 np0005592767 nova_compute[182623]: 2026-01-22 22:33:08.569 182627 DEBUG oslo_concurrency.lockutils [None req-94cb1558-8813-4c99-9ff7-fbc02753cb1c 3738d2d62baa4adc84f010ecf9eda9ec 01c8405dbfef4380888a9355710f3976 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.167 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.167 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.190 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.299 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.300 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.308 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.309 182627 INFO nova.compute.claims [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.427 182627 DEBUG nova.compute.provider_tree [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.441 182627 DEBUG nova.scheduler.client.report [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.460 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.460 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.521 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.522 182627 DEBUG nova.network.neutron [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.543 182627 INFO nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.561 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.677 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.680 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.680 182627 INFO nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Creating image(s)#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.682 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "/var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.682 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "/var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.684 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "/var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.717 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.786 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.788 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.789 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.811 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.870 182627 DEBUG nova.policy [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9cae4b8c69234f14817ee89dfe745644', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e2f582b87f84b00b5742f191c2b04af', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.875 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.876 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.921 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.923 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.923 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.993 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.994 182627 DEBUG nova.virt.disk.api [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Checking if we can resize image /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:33:09 np0005592767 nova_compute[182623]: 2026-01-22 22:33:09.995 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.057 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.058 182627 DEBUG nova.virt.disk.api [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Cannot resize image /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.059 182627 DEBUG nova.objects.instance [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lazy-loading 'migration_context' on Instance uuid b7c14ca2-f444-48b4-b9e7-744e2cd63c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.073 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.074 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Ensure instance console log exists: /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.075 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.075 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.076 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.123 182627 DEBUG nova.compute.manager [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.123 182627 DEBUG oslo_concurrency.lockutils [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.124 182627 DEBUG oslo_concurrency.lockutils [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.124 182627 DEBUG oslo_concurrency.lockutils [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bf6e2992-6fe6-4c4c-ad16-a5342029c966-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.124 182627 DEBUG nova.compute.manager [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] No waiting events found dispatching network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.124 182627 WARNING nova.compute.manager [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received unexpected event network-vif-plugged-6e95257d-fd9a-4ffd-a45b-9081471843c8 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:33:10 np0005592767 nova_compute[182623]: 2026-01-22 22:33:10.124 182627 DEBUG nova.compute.manager [req-4adf277c-dcab-4fb9-8a8f-6d6f32ab6e6c req-0924ed8a-ce5d-4157-a640-1967e43169e5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Received event network-vif-deleted-6e95257d-fd9a-4ffd-a45b-9081471843c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:11 np0005592767 nova_compute[182623]: 2026-01-22 22:33:11.098 182627 DEBUG nova.network.neutron [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Successfully created port: 6c400807-156f-43fe-8e7b-816d56d58dae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:33:11 np0005592767 nova_compute[182623]: 2026-01-22 22:33:11.311 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:12.104 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:12.105 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:12.105 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.187 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.293 182627 DEBUG nova.network.neutron [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Successfully updated port: 6c400807-156f-43fe-8e7b-816d56d58dae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.320 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "refresh_cache-b7c14ca2-f444-48b4-b9e7-744e2cd63c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.321 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquired lock "refresh_cache-b7c14ca2-f444-48b4-b9e7-744e2cd63c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.321 182627 DEBUG nova.network.neutron [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.502 182627 DEBUG nova.compute.manager [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-changed-6c400807-156f-43fe-8e7b-816d56d58dae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.503 182627 DEBUG nova.compute.manager [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Refreshing instance network info cache due to event network-changed-6c400807-156f-43fe-8e7b-816d56d58dae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.503 182627 DEBUG oslo_concurrency.lockutils [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b7c14ca2-f444-48b4-b9e7-744e2cd63c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:12 np0005592767 nova_compute[182623]: 2026-01-22 22:33:12.553 182627 DEBUG nova.network.neutron [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.664 182627 DEBUG nova.network.neutron [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Updating instance_info_cache with network_info: [{"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.683 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Releasing lock "refresh_cache-b7c14ca2-f444-48b4-b9e7-744e2cd63c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.683 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Instance network_info: |[{"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.684 182627 DEBUG oslo_concurrency.lockutils [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b7c14ca2-f444-48b4-b9e7-744e2cd63c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.684 182627 DEBUG nova.network.neutron [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Refreshing network info cache for port 6c400807-156f-43fe-8e7b-816d56d58dae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.688 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Start _get_guest_xml network_info=[{"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.694 182627 WARNING nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.699 182627 DEBUG nova.virt.libvirt.host [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.699 182627 DEBUG nova.virt.libvirt.host [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.704 182627 DEBUG nova.virt.libvirt.host [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.705 182627 DEBUG nova.virt.libvirt.host [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.706 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.707 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.707 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.707 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.708 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.708 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.708 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.708 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.709 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.709 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.709 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.709 182627 DEBUG nova.virt.hardware [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.713 182627 DEBUG nova.virt.libvirt.vif [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-281089716',display_name='tempest-ServerAddressesTestJSON-server-281089716',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-281089716',id=98,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0e2f582b87f84b00b5742f191c2b04af',ramdisk_id='',reservation_id='r-8uknps3w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1452571470',owner_user_name='tempest-ServerAddresse
sTestJSON-1452571470-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:09Z,user_data=None,user_id='9cae4b8c69234f14817ee89dfe745644',uuid=b7c14ca2-f444-48b4-b9e7-744e2cd63c78,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.713 182627 DEBUG nova.network.os_vif_util [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Converting VIF {"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.714 182627 DEBUG nova.network.os_vif_util [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.715 182627 DEBUG nova.objects.instance [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lazy-loading 'pci_devices' on Instance uuid b7c14ca2-f444-48b4-b9e7-744e2cd63c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.732 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <uuid>b7c14ca2-f444-48b4-b9e7-744e2cd63c78</uuid>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <name>instance-00000062</name>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerAddressesTestJSON-server-281089716</nova:name>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:33:13</nova:creationTime>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:user uuid="9cae4b8c69234f14817ee89dfe745644">tempest-ServerAddressesTestJSON-1452571470-project-member</nova:user>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:project uuid="0e2f582b87f84b00b5742f191c2b04af">tempest-ServerAddressesTestJSON-1452571470</nova:project>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        <nova:port uuid="6c400807-156f-43fe-8e7b-816d56d58dae">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <entry name="serial">b7c14ca2-f444-48b4-b9e7-744e2cd63c78</entry>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <entry name="uuid">b7c14ca2-f444-48b4-b9e7-744e2cd63c78</entry>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.config"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:8c:a9:e3"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <target dev="tap6c400807-15"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/console.log" append="off"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:33:13 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:33:13 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:33:13 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:33:13 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.734 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Preparing to wait for external event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.734 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.734 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.734 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.735 182627 DEBUG nova.virt.libvirt.vif [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-281089716',display_name='tempest-ServerAddressesTestJSON-server-281089716',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-281089716',id=98,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0e2f582b87f84b00b5742f191c2b04af',ramdisk_id='',reservation_id='r-8uknps3w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1452571470',owner_user_name='tempest-ServerAddressesTestJSON-1452571470-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:09Z,user_data=None,user_id='9cae4b8c69234f14817ee89dfe745644',uuid=b7c14ca2-f444-48b4-b9e7-744e2cd63c78,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.736 182627 DEBUG nova.network.os_vif_util [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Converting VIF {"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.736 182627 DEBUG nova.network.os_vif_util [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.737 182627 DEBUG os_vif [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.737 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.738 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.738 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.740 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.741 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c400807-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.741 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6c400807-15, col_values=(('external_ids', {'iface-id': '6c400807-156f-43fe-8e7b-816d56d58dae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:a9:e3', 'vm-uuid': 'b7c14ca2-f444-48b4-b9e7-744e2cd63c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.743 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:13 np0005592767 NetworkManager[54973]: <info>  [1769121193.7440] manager: (tap6c400807-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.745 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.753 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.754 182627 INFO os_vif [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15')#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.841 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.842 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.842 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] No VIF found with MAC fa:16:3e:8c:a9:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:33:13 np0005592767 nova_compute[182623]: 2026-01-22 22:33:13.843 182627 INFO nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Using config drive#033[00m
Jan 22 17:33:13 np0005592767 podman[224812]: 2026-01-22 22:33:13.876679505 +0000 UTC m=+0.080424787 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 17:33:14 np0005592767 nova_compute[182623]: 2026-01-22 22:33:14.397 182627 INFO nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Creating config drive at /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.config#033[00m
Jan 22 17:33:14 np0005592767 nova_compute[182623]: 2026-01-22 22:33:14.402 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdxdnesaq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:14 np0005592767 nova_compute[182623]: 2026-01-22 22:33:14.548 182627 DEBUG oslo_concurrency.processutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdxdnesaq" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:14 np0005592767 kernel: tap6c400807-15: entered promiscuous mode
Jan 22 17:33:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:14Z|00364|binding|INFO|Claiming lport 6c400807-156f-43fe-8e7b-816d56d58dae for this chassis.
Jan 22 17:33:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:14Z|00365|binding|INFO|6c400807-156f-43fe-8e7b-816d56d58dae: Claiming fa:16:3e:8c:a9:e3 10.100.0.3
Jan 22 17:33:14 np0005592767 NetworkManager[54973]: <info>  [1769121194.6344] manager: (tap6c400807-15): new Tun device (/org/freedesktop/NetworkManager/Devices/174)
Jan 22 17:33:14 np0005592767 nova_compute[182623]: 2026-01-22 22:33:14.632 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.643 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:a9:e3 10.100.0.3'], port_security=['fa:16:3e:8c:a9:e3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b7c14ca2-f444-48b4-b9e7-744e2cd63c78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e2f582b87f84b00b5742f191c2b04af', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09ab7e27-7ca6-490b-a9df-74d647171fe1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=138e67dc-cbe9-4444-a91b-e1bade9c871a, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=6c400807-156f-43fe-8e7b-816d56d58dae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.646 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 6c400807-156f-43fe-8e7b-816d56d58dae in datapath b8f70305-a2df-4443-b7a1-0b4881c0c517 bound to our chassis#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.649 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8f70305-a2df-4443-b7a1-0b4881c0c517#033[00m
Jan 22 17:33:14 np0005592767 systemd-udevd[224848]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.674 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c39d6b0-8f39-4c5a-afe0-212cce4e55f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.676 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8f70305-a1 in ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.678 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8f70305-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.678 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0ab4d7-1b86-463b-9944-fead817ec423]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.679 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c71e121a-4e67-4602-998b-8c915ba7b95f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 NetworkManager[54973]: <info>  [1769121194.6844] device (tap6c400807-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:33:14 np0005592767 NetworkManager[54973]: <info>  [1769121194.6852] device (tap6c400807-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:33:14 np0005592767 nova_compute[182623]: 2026-01-22 22:33:14.689 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:14Z|00366|binding|INFO|Setting lport 6c400807-156f-43fe-8e7b-816d56d58dae ovn-installed in OVS
Jan 22 17:33:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:14Z|00367|binding|INFO|Setting lport 6c400807-156f-43fe-8e7b-816d56d58dae up in Southbound
Jan 22 17:33:14 np0005592767 nova_compute[182623]: 2026-01-22 22:33:14.695 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.699 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[e8376ab9-ad4d-4e2f-a566-fa0dd37d8ddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 systemd-machined[153912]: New machine qemu-47-instance-00000062.
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.716 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[54125046-6509-4c9b-addd-8947f5ce2345]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 systemd[1]: Started Virtual Machine qemu-47-instance-00000062.
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.756 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1fb027-d602-40d6-8ac9-3ab6095aad62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 NetworkManager[54973]: <info>  [1769121194.7654] manager: (tapb8f70305-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/175)
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.764 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[03f6bc55-598d-4ebb-9529-fc3405f5b8f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.799 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[96600fee-85e1-46ab-b0be-9b139dbe51bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.803 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[49526912-333a-4074-bf41-fd63034a0cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 NetworkManager[54973]: <info>  [1769121194.8388] device (tapb8f70305-a0): carrier: link connected
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.849 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff282d2-c73a-4ed5-be54-40353d2a0ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.873 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0a22f022-47fc-42b8-89d6-c8cef4baf1c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8f70305-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:18:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479141, 'reachable_time': 26435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224883, 'error': None, 'target': 'ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.889 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4b71b06d-d76a-484a-9922-017b3240d416]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:18ce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 479141, 'tstamp': 479141}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224884, 'error': None, 'target': 'ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.912 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b756df7f-4791-45aa-9448-72109b2f971a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8f70305-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:18:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479141, 'reachable_time': 26435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224885, 'error': None, 'target': 'ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:14.953 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[563402b0-3a53-4afa-806d-b1d084ca54ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.026 182627 DEBUG nova.compute.manager [req-b7669003-174c-4dee-8a4a-4bf586ffefdf req-22cd3ef1-9a3a-4d1b-b208-12331566635a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.027 182627 DEBUG oslo_concurrency.lockutils [req-b7669003-174c-4dee-8a4a-4bf586ffefdf req-22cd3ef1-9a3a-4d1b-b208-12331566635a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.027 182627 DEBUG oslo_concurrency.lockutils [req-b7669003-174c-4dee-8a4a-4bf586ffefdf req-22cd3ef1-9a3a-4d1b-b208-12331566635a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.027 182627 DEBUG oslo_concurrency.lockutils [req-b7669003-174c-4dee-8a4a-4bf586ffefdf req-22cd3ef1-9a3a-4d1b-b208-12331566635a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.027 182627 DEBUG nova.compute.manager [req-b7669003-174c-4dee-8a4a-4bf586ffefdf req-22cd3ef1-9a3a-4d1b-b208-12331566635a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Processing event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.044 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5f727f-0936-47bb-9b4f-13702915b781]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.046 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8f70305-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.047 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.048 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8f70305-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:15 np0005592767 NetworkManager[54973]: <info>  [1769121195.0505] manager: (tapb8f70305-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 22 17:33:15 np0005592767 kernel: tapb8f70305-a0: entered promiscuous mode
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.049 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.053 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8f70305-a0, col_values=(('external_ids', {'iface-id': '645c7c33-10f5-4a64-8c00-bd132f59ab7e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:15Z|00368|binding|INFO|Releasing lport 645c7c33-10f5-4a64-8c00-bd132f59ab7e from this chassis (sb_readonly=0)
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.065 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8f70305-a2df-4443-b7a1-0b4881c0c517.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8f70305-a2df-4443-b7a1-0b4881c0c517.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.065 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.066 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[274a9586-736a-4e89-98a6-61bed01656e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.067 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-b8f70305-a2df-4443-b7a1-0b4881c0c517
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/b8f70305-a2df-4443-b7a1-0b4881c0c517.pid.haproxy
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID b8f70305-a2df-4443-b7a1-0b4881c0c517
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:33:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:15.069 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'env', 'PROCESS_TAG=haproxy-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8f70305-a2df-4443-b7a1-0b4881c0c517.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.225 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121195.2247722, b7c14ca2-f444-48b4-b9e7-744e2cd63c78 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.226 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] VM Started (Lifecycle Event)#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.236 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.251 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.255 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.259 182627 INFO nova.virt.libvirt.driver [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Instance spawned successfully.#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.259 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.262 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.287 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.288 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.288 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.289 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.290 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.291 182627 DEBUG nova.virt.libvirt.driver [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.295 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.295 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121195.2249188, b7c14ca2-f444-48b4-b9e7-744e2cd63c78 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.296 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.345 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.349 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121195.2466505, b7c14ca2-f444-48b4-b9e7-744e2cd63c78 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.350 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.373 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.377 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.403 182627 INFO nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Took 5.73 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.404 182627 DEBUG nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.405 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.488 182627 INFO nova.compute.manager [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Took 6.22 seconds to build instance.#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.518 182627 DEBUG oslo_concurrency.lockutils [None req-2b371729-4dc0-4b3c-9a3e-24c0d15e7527 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:15 np0005592767 podman[224924]: 2026-01-22 22:33:15.53369341 +0000 UTC m=+0.082325379 container create 85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:33:15 np0005592767 podman[224924]: 2026-01-22 22:33:15.487521745 +0000 UTC m=+0.036153714 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:33:15 np0005592767 systemd[1]: Started libpod-conmon-85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e.scope.
Jan 22 17:33:15 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:33:15 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f14ece7cb96f91583d8bd2c19c4faf165199408aceaa9f2750b44afdec2af69a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:33:15 np0005592767 podman[224924]: 2026-01-22 22:33:15.633863815 +0000 UTC m=+0.182495764 container init 85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:33:15 np0005592767 podman[224924]: 2026-01-22 22:33:15.640306997 +0000 UTC m=+0.188938936 container start 85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:33:15 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [NOTICE]   (224944) : New worker (224946) forked
Jan 22 17:33:15 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [NOTICE]   (224944) : Loading success.
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.727 182627 DEBUG nova.network.neutron [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Updated VIF entry in instance network info cache for port 6c400807-156f-43fe-8e7b-816d56d58dae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.728 182627 DEBUG nova.network.neutron [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Updating instance_info_cache with network_info: [{"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.748 182627 DEBUG oslo_concurrency.lockutils [req-85aa69e5-af41-4e2d-9530-4089bdbe28ef req-d11ba1f0-5b0f-4f34-80a6-bd5f422d9cf6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b7c14ca2-f444-48b4-b9e7-744e2cd63c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:15 np0005592767 nova_compute[182623]: 2026-01-22 22:33:15.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.315 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.562 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.562 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.563 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.564 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.564 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.581 182627 INFO nova.compute.manager [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Terminating instance#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.591 182627 DEBUG nova.compute.manager [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:33:16 np0005592767 kernel: tap6c400807-15 (unregistering): left promiscuous mode
Jan 22 17:33:16 np0005592767 NetworkManager[54973]: <info>  [1769121196.6188] device (tap6c400807-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.628 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:16Z|00369|binding|INFO|Releasing lport 6c400807-156f-43fe-8e7b-816d56d58dae from this chassis (sb_readonly=0)
Jan 22 17:33:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:16Z|00370|binding|INFO|Setting lport 6c400807-156f-43fe-8e7b-816d56d58dae down in Southbound
Jan 22 17:33:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:16Z|00371|binding|INFO|Removing iface tap6c400807-15 ovn-installed in OVS
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.632 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:16.638 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:a9:e3 10.100.0.3'], port_security=['fa:16:3e:8c:a9:e3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b7c14ca2-f444-48b4-b9e7-744e2cd63c78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e2f582b87f84b00b5742f191c2b04af', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09ab7e27-7ca6-490b-a9df-74d647171fe1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=138e67dc-cbe9-4444-a91b-e1bade9c871a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=6c400807-156f-43fe-8e7b-816d56d58dae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:16.639 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 6c400807-156f-43fe-8e7b-816d56d58dae in datapath b8f70305-a2df-4443-b7a1-0b4881c0c517 unbound from our chassis#033[00m
Jan 22 17:33:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:16.640 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8f70305-a2df-4443-b7a1-0b4881c0c517, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:33:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:16.641 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d58d816d-42ad-4db5-ab0c-ff9ff3d12245]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:16.642 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517 namespace which is not needed anymore#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.664 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000062.scope: Deactivated successfully.
Jan 22 17:33:16 np0005592767 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000062.scope: Consumed 1.886s CPU time.
Jan 22 17:33:16 np0005592767 systemd-machined[153912]: Machine qemu-47-instance-00000062 terminated.
Jan 22 17:33:16 np0005592767 NetworkManager[54973]: <info>  [1769121196.8193] manager: (tap6c400807-15): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Jan 22 17:33:16 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [NOTICE]   (224944) : haproxy version is 2.8.14-c23fe91
Jan 22 17:33:16 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [NOTICE]   (224944) : path to executable is /usr/sbin/haproxy
Jan 22 17:33:16 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [WARNING]  (224944) : Exiting Master process...
Jan 22 17:33:16 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [ALERT]    (224944) : Current worker (224946) exited with code 143 (Terminated)
Jan 22 17:33:16 np0005592767 neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517[224940]: [WARNING]  (224944) : All workers exited. Exiting... (0)
Jan 22 17:33:16 np0005592767 systemd[1]: libpod-85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e.scope: Deactivated successfully.
Jan 22 17:33:16 np0005592767 podman[224976]: 2026-01-22 22:33:16.842544455 +0000 UTC m=+0.062429457 container died 85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.886 182627 INFO nova.virt.libvirt.driver [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Instance destroyed successfully.#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.888 182627 DEBUG nova.objects.instance [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lazy-loading 'resources' on Instance uuid b7c14ca2-f444-48b4-b9e7-744e2cd63c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:16 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e-userdata-shm.mount: Deactivated successfully.
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.897 182627 DEBUG nova.virt.libvirt.vif [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-281089716',display_name='tempest-ServerAddressesTestJSON-server-281089716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-281089716',id=98,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:33:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0e2f582b87f84b00b5742f191c2b04af',ramdisk_id='',reservation_id='r-8uknps3w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-1452571470',owner_user_name='tempest-ServerAddressesTestJSON-1452571470-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:33:15Z,user_data=None,user_id='9cae4b8c69234f14817ee89dfe745644',uuid=b7c14ca2-f444-48b4-b9e7-744e2cd63c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.897 182627 DEBUG nova.network.os_vif_util [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Converting VIF {"id": "6c400807-156f-43fe-8e7b-816d56d58dae", "address": "fa:16:3e:8c:a9:e3", "network": {"id": "b8f70305-a2df-4443-b7a1-0b4881c0c517", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1030815999-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2f582b87f84b00b5742f191c2b04af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c400807-15", "ovs_interfaceid": "6c400807-156f-43fe-8e7b-816d56d58dae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.898 182627 DEBUG nova.network.os_vif_util [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.898 182627 DEBUG os_vif [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.900 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.900 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c400807-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.902 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 systemd[1]: var-lib-containers-storage-overlay-f14ece7cb96f91583d8bd2c19c4faf165199408aceaa9f2750b44afdec2af69a-merged.mount: Deactivated successfully.
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.903 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.907 182627 INFO os_vif [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:a9:e3,bridge_name='br-int',has_traffic_filtering=True,id=6c400807-156f-43fe-8e7b-816d56d58dae,network=Network(b8f70305-a2df-4443-b7a1-0b4881c0c517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c400807-15')#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.907 182627 INFO nova.virt.libvirt.driver [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Deleting instance files /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78_del#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.908 182627 INFO nova.virt.libvirt.driver [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Deletion of /var/lib/nova/instances/b7c14ca2-f444-48b4-b9e7-744e2cd63c78_del complete#033[00m
Jan 22 17:33:16 np0005592767 podman[224976]: 2026-01-22 22:33:16.913889424 +0000 UTC m=+0.133774386 container cleanup 85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:33:16 np0005592767 systemd[1]: libpod-conmon-85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e.scope: Deactivated successfully.
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.989 182627 INFO nova.compute.manager [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.990 182627 DEBUG oslo.service.loopingcall [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.990 182627 DEBUG nova.compute.manager [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:33:16 np0005592767 nova_compute[182623]: 2026-01-22 22:33:16.991 182627 DEBUG nova.network.neutron [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:33:17 np0005592767 podman[225019]: 2026-01-22 22:33:17.000965588 +0000 UTC m=+0.054906035 container remove 85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.010 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75612f70-be70-4142-a9a3-0844cea66309]: (4, ('Thu Jan 22 10:33:16 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517 (85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e)\n85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e\nThu Jan 22 10:33:16 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517 (85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e)\n85b659ee75b60b5bfa2e58587312c2d988d37c8bf1ec9f725fa560ea0a7c2a1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.013 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[02d34a9e-08b1-4bd1-8f3a-a2beba922776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.014 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8f70305-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:17 np0005592767 kernel: tapb8f70305-a0: left promiscuous mode
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.040 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[daf16555-946d-4b14-9ab1-b84d3a6fcb5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.058 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7b6c4d-a423-4453-96e3-5c3283d8baee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[50529621-b6f7-44a7-ba39-c9bde4495c4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.080 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[424e1e2c-e1cb-4062-9673-a33f29935bcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479132, 'reachable_time': 31450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225034, 'error': None, 'target': 'ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.084 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8f70305-a2df-4443-b7a1-0b4881c0c517 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:33:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:17.084 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d637ef0a-04fe-41bc-8291-32b11fc3c20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:17 np0005592767 systemd[1]: run-netns-ovnmeta\x2db8f70305\x2da2df\x2d4443\x2db7a1\x2d0b4881c0c517.mount: Deactivated successfully.
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.141 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.141 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.141 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.142 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.142 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] No waiting events found dispatching network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.142 182627 WARNING nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received unexpected event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.142 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-vif-unplugged-6c400807-156f-43fe-8e7b-816d56d58dae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.142 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.143 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.143 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.143 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] No waiting events found dispatching network-vif-unplugged-6c400807-156f-43fe-8e7b-816d56d58dae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.143 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-vif-unplugged-6c400807-156f-43fe-8e7b-816d56d58dae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.143 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.144 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.144 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.144 182627 DEBUG oslo_concurrency.lockutils [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.144 182627 DEBUG nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] No waiting events found dispatching network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.144 182627 WARNING nova.compute.manager [req-658c3524-5c79-4287-bc8f-6c7acc985d27 req-5ce67efb-acec-4727-b06d-d66dabbbe2d9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received unexpected event network-vif-plugged-6c400807-156f-43fe-8e7b-816d56d58dae for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.871 182627 DEBUG nova.network.neutron [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:17 np0005592767 nova_compute[182623]: 2026-01-22 22:33:17.898 182627 INFO nova.compute.manager [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Took 0.91 seconds to deallocate network for instance.#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.001 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.002 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.070 182627 DEBUG nova.compute.provider_tree [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.092 182627 DEBUG nova.scheduler.client.report [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.110 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.170 182627 INFO nova.scheduler.client.report [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Deleted allocations for instance b7c14ca2-f444-48b4-b9e7-744e2cd63c78#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.243 182627 DEBUG oslo_concurrency.lockutils [None req-7ddff434-236c-4ab8-8de9-36c07ba0b41b 9cae4b8c69234f14817ee89dfe745644 0e2f582b87f84b00b5742f191c2b04af - - default default] Lock "b7c14ca2-f444-48b4-b9e7-744e2cd63c78" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.500 182627 DEBUG nova.compute.manager [req-ff4cf7ad-bbb1-4d98-8a3a-87cdd62b61c7 req-43f30612-5df8-401e-ba4b-d894f0423bdf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Received event network-vif-deleted-6c400807-156f-43fe-8e7b-816d56d58dae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:18 np0005592767 nova_compute[182623]: 2026-01-22 22:33:18.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:20 np0005592767 podman[225038]: 2026-01-22 22:33:20.601163737 +0000 UTC m=+0.089406281 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public)
Jan 22 17:33:20 np0005592767 podman[225037]: 2026-01-22 22:33:20.642529767 +0000 UTC m=+0.129835414 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.939 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.940 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.940 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:20 np0005592767 nova_compute[182623]: 2026-01-22 22:33:20.940 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.316 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.902 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.961 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.962 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.962 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:21 np0005592767 nova_compute[182623]: 2026-01-22 22:33:21.963 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.148 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.150 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5672MB free_disk=73.23095321655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.150 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.151 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.160 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121187.159396, bf6e2992-6fe6-4c4c-ad16-a5342029c966 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.160 182627 INFO nova.compute.manager [-] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.182 182627 DEBUG nova.compute.manager [None req-8dbf8dc5-50df-4f54-b77f-f3c1389e5457 - - - - - -] [instance: bf6e2992-6fe6-4c4c-ad16-a5342029c966] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.212 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.212 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.243 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.258 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.288 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.288 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:22 np0005592767 nova_compute[182623]: 2026-01-22 22:33:22.959 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:25 np0005592767 podman[225088]: 2026-01-22 22:33:25.128154331 +0000 UTC m=+0.044982244 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:33:25 np0005592767 podman[225089]: 2026-01-22 22:33:25.166888557 +0000 UTC m=+0.070321921 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:33:25 np0005592767 nova_compute[182623]: 2026-01-22 22:33:25.290 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:26 np0005592767 nova_compute[182623]: 2026-01-22 22:33:26.319 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:26 np0005592767 nova_compute[182623]: 2026-01-22 22:33:26.905 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.068 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.069 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.086 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.183 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.183 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.188 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.189 182627 INFO nova.compute.claims [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.322 182627 DEBUG nova.compute.provider_tree [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.352 182627 DEBUG nova.scheduler.client.report [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.369 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.370 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.425 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.426 182627 DEBUG nova.network.neutron [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.445 182627 INFO nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.459 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.567 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.568 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.568 182627 INFO nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Creating image(s)#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.569 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "/var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.569 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "/var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.570 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "/var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.581 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.634 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.635 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.636 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.646 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.693 182627 DEBUG nova.policy [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d6cbf2c31d34db0a5b2c5465f83dd85', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '070241059b4c45da8be359eb0123c835', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.697 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.698 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.731 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.733 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.734 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.825 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.827 182627 DEBUG nova.virt.disk.api [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Checking if we can resize image /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.828 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.900 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.900 182627 DEBUG nova.virt.disk.api [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Cannot resize image /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.901 182627 DEBUG nova.objects.instance [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lazy-loading 'migration_context' on Instance uuid 444d90a7-b970-474d-8e18-eaab83f057d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.914 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.914 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Ensure instance console log exists: /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.915 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.915 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:28 np0005592767 nova_compute[182623]: 2026-01-22 22:33:28.915 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:29 np0005592767 nova_compute[182623]: 2026-01-22 22:33:29.445 182627 DEBUG nova.network.neutron [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Successfully created port: 7958633a-5f31-4f72-adc5-ea7cdaa817ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.309 182627 DEBUG nova.network.neutron [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Successfully updated port: 7958633a-5f31-4f72-adc5-ea7cdaa817ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.327 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "refresh_cache-444d90a7-b970-474d-8e18-eaab83f057d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.327 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquired lock "refresh_cache-444d90a7-b970-474d-8e18-eaab83f057d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.327 182627 DEBUG nova.network.neutron [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.437 182627 DEBUG nova.compute.manager [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-changed-7958633a-5f31-4f72-adc5-ea7cdaa817ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.437 182627 DEBUG nova.compute.manager [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Refreshing instance network info cache due to event network-changed-7958633a-5f31-4f72-adc5-ea7cdaa817ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.438 182627 DEBUG oslo_concurrency.lockutils [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-444d90a7-b970-474d-8e18-eaab83f057d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:30 np0005592767 nova_compute[182623]: 2026-01-22 22:33:30.515 182627 DEBUG nova.network.neutron [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.322 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.458 182627 DEBUG nova.network.neutron [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Updating instance_info_cache with network_info: [{"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.509 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Releasing lock "refresh_cache-444d90a7-b970-474d-8e18-eaab83f057d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.510 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Instance network_info: |[{"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.511 182627 DEBUG oslo_concurrency.lockutils [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-444d90a7-b970-474d-8e18-eaab83f057d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.511 182627 DEBUG nova.network.neutron [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Refreshing network info cache for port 7958633a-5f31-4f72-adc5-ea7cdaa817ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.516 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Start _get_guest_xml network_info=[{"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.522 182627 WARNING nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.529 182627 DEBUG nova.virt.libvirt.host [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.530 182627 DEBUG nova.virt.libvirt.host [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.546 182627 DEBUG nova.virt.libvirt.host [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.547 182627 DEBUG nova.virt.libvirt.host [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.549 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.549 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.550 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.550 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.550 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.551 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.551 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.551 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.552 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.552 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.552 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.552 182627 DEBUG nova.virt.hardware [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.557 182627 DEBUG nova.virt.libvirt.vif [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-956648261',display_name='tempest-ServerMetadataTestJSON-server-956648261',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-956648261',id=100,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='070241059b4c45da8be359eb0123c835',ramdisk_id='',reservation_id='r-0wi96l4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-894066612',owner_user_name='tempest-ServerMetadataTest
JSON-894066612-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:28Z,user_data=None,user_id='0d6cbf2c31d34db0a5b2c5465f83dd85',uuid=444d90a7-b970-474d-8e18-eaab83f057d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.558 182627 DEBUG nova.network.os_vif_util [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Converting VIF {"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.558 182627 DEBUG nova.network.os_vif_util [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.560 182627 DEBUG nova.objects.instance [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lazy-loading 'pci_devices' on Instance uuid 444d90a7-b970-474d-8e18-eaab83f057d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.577 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <uuid>444d90a7-b970-474d-8e18-eaab83f057d5</uuid>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <name>instance-00000064</name>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerMetadataTestJSON-server-956648261</nova:name>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:33:31</nova:creationTime>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:user uuid="0d6cbf2c31d34db0a5b2c5465f83dd85">tempest-ServerMetadataTestJSON-894066612-project-member</nova:user>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:project uuid="070241059b4c45da8be359eb0123c835">tempest-ServerMetadataTestJSON-894066612</nova:project>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        <nova:port uuid="7958633a-5f31-4f72-adc5-ea7cdaa817ec">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <entry name="serial">444d90a7-b970-474d-8e18-eaab83f057d5</entry>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <entry name="uuid">444d90a7-b970-474d-8e18-eaab83f057d5</entry>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.config"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:be:bc:24"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <target dev="tap7958633a-5f"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/console.log" append="off"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:33:31 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:33:31 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:33:31 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:33:31 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.579 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Preparing to wait for external event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.579 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.579 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.579 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.580 182627 DEBUG nova.virt.libvirt.vif [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-956648261',display_name='tempest-ServerMetadataTestJSON-server-956648261',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-956648261',id=100,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='070241059b4c45da8be359eb0123c835',ramdisk_id='',reservation_id='r-0wi96l4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-894066612',owner_user_name='tempest-ServerMe
tadataTestJSON-894066612-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:28Z,user_data=None,user_id='0d6cbf2c31d34db0a5b2c5465f83dd85',uuid=444d90a7-b970-474d-8e18-eaab83f057d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.580 182627 DEBUG nova.network.os_vif_util [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Converting VIF {"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.581 182627 DEBUG nova.network.os_vif_util [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.581 182627 DEBUG os_vif [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.582 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.582 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.582 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.585 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.585 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7958633a-5f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.586 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7958633a-5f, col_values=(('external_ids', {'iface-id': '7958633a-5f31-4f72-adc5-ea7cdaa817ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:bc:24', 'vm-uuid': '444d90a7-b970-474d-8e18-eaab83f057d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.587 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:31 np0005592767 NetworkManager[54973]: <info>  [1769121211.5887] manager: (tap7958633a-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.589 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.594 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.595 182627 INFO os_vif [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f')#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.645 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.645 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.645 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] No VIF found with MAC fa:16:3e:be:bc:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.646 182627 INFO nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Using config drive#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.881 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121196.8798115, b7c14ca2-f444-48b4-b9e7-744e2cd63c78 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.881 182627 INFO nova.compute.manager [-] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:33:31 np0005592767 nova_compute[182623]: 2026-01-22 22:33:31.902 182627 DEBUG nova.compute.manager [None req-12f1fe11-5386-468a-9d91-a4a062d0d849 - - - - - -] [instance: b7c14ca2-f444-48b4-b9e7-744e2cd63c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.016 182627 INFO nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Creating config drive at /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.config#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.024 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfthvqvo5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.153 182627 DEBUG oslo_concurrency.processutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfthvqvo5" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:33 np0005592767 podman[225149]: 2026-01-22 22:33:33.157932376 +0000 UTC m=+0.063245321 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:33:33 np0005592767 kernel: tap7958633a-5f: entered promiscuous mode
Jan 22 17:33:33 np0005592767 NetworkManager[54973]: <info>  [1769121213.2173] manager: (tap7958633a-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Jan 22 17:33:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:33Z|00372|binding|INFO|Claiming lport 7958633a-5f31-4f72-adc5-ea7cdaa817ec for this chassis.
Jan 22 17:33:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:33Z|00373|binding|INFO|7958633a-5f31-4f72-adc5-ea7cdaa817ec: Claiming fa:16:3e:be:bc:24 10.100.0.12
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.219 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.222 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.238 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:bc:24 10.100.0.12'], port_security=['fa:16:3e:be:bc:24 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '444d90a7-b970-474d-8e18-eaab83f057d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '070241059b4c45da8be359eb0123c835', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0d68d84a-a893-4fa6-95d3-0f6a74fabbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=37daf1d8-8950-4d32-9417-c12eb73acd75, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=7958633a-5f31-4f72-adc5-ea7cdaa817ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.239 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 7958633a-5f31-4f72-adc5-ea7cdaa817ec in datapath aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 bound to our chassis#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.240 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aef6bd13-04c4-41b0-9c55-cd87df6d5ff2#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.251 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bb167029-a077-45e4-8717-2880fed7c96e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.252 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaef6bd13-01 in ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:33:33 np0005592767 systemd-udevd[225187]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.254 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaef6bd13-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.255 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba1d8fc-09ef-4497-bc3f-f32b3f84d77a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.255 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbb709b-7f66-4b26-a10f-d445cbc28838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 NetworkManager[54973]: <info>  [1769121213.3187] device (tap7958633a-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:33:33 np0005592767 NetworkManager[54973]: <info>  [1769121213.3198] device (tap7958633a-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.319 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d58ea019-11e7-4a6c-9009-068669280226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.321 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 systemd-machined[153912]: New machine qemu-48-instance-00000064.
Jan 22 17:33:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:33Z|00374|binding|INFO|Setting lport 7958633a-5f31-4f72-adc5-ea7cdaa817ec ovn-installed in OVS
Jan 22 17:33:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:33Z|00375|binding|INFO|Setting lport 7958633a-5f31-4f72-adc5-ea7cdaa817ec up in Southbound
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.329 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.335 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb9f1fb-3a14-44b3-a93d-bb1c9c9a95b1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 systemd[1]: Started Virtual Machine qemu-48-instance-00000064.
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.362 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[61048b93-5710-4a2e-8160-318e691651d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 NetworkManager[54973]: <info>  [1769121213.3684] manager: (tapaef6bd13-00): new Veth device (/org/freedesktop/NetworkManager/Devices/180)
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.369 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[68b83ddf-999e-4f72-9e22-248f0ede79dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 systemd-udevd[225192]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.396 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[bba10ee4-5bd0-4dd8-8503-29a8fea10c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.399 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4eecb975-1287-4423-9288-0656db4e2df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 NetworkManager[54973]: <info>  [1769121213.4299] device (tapaef6bd13-00): carrier: link connected
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.441 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc01d45-012b-4cba-95a3-aca55ba048a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.468 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[94f1cf95-643d-4121-a02e-e2de3709c4cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaef6bd13-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:cc:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481000, 'reachable_time': 38466, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225221, 'error': None, 'target': 'ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.495 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[175a417a-57cc-41cf-95be-1d7600e94a23]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:cc62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 481000, 'tstamp': 481000}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225222, 'error': None, 'target': 'ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.529 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b0404d14-b049-4072-bdba-795e2ad375b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaef6bd13-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:cc:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481000, 'reachable_time': 38466, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225223, 'error': None, 'target': 'ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.581 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8dbacd30-c0e8-4de5-8c45-07e780c848a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.657 182627 DEBUG nova.network.neutron [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Updated VIF entry in instance network info cache for port 7958633a-5f31-4f72-adc5-ea7cdaa817ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.658 182627 DEBUG nova.network.neutron [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Updating instance_info_cache with network_info: [{"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.671 182627 DEBUG nova.compute.manager [req-9ac99c39-6d3f-4f27-9e1d-33216056d933 req-fe126a0d-9ac9-4543-b790-9172751a28df 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.672 182627 DEBUG oslo_concurrency.lockutils [req-9ac99c39-6d3f-4f27-9e1d-33216056d933 req-fe126a0d-9ac9-4543-b790-9172751a28df 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.672 182627 DEBUG oslo_concurrency.lockutils [req-9ac99c39-6d3f-4f27-9e1d-33216056d933 req-fe126a0d-9ac9-4543-b790-9172751a28df 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.672 182627 DEBUG oslo_concurrency.lockutils [req-9ac99c39-6d3f-4f27-9e1d-33216056d933 req-fe126a0d-9ac9-4543-b790-9172751a28df 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.673 182627 DEBUG nova.compute.manager [req-9ac99c39-6d3f-4f27-9e1d-33216056d933 req-fe126a0d-9ac9-4543-b790-9172751a28df 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Processing event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.675 182627 DEBUG oslo_concurrency.lockutils [req-21fddab4-9ad9-4f9d-a3f3-f02138cba5a4 req-bf54ad4e-8bc1-482d-8e16-151da9f962f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-444d90a7-b970-474d-8e18-eaab83f057d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.690 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fc46ad0d-1b98-4957-95c3-05771fd91773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.691 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaef6bd13-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.692 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.693 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaef6bd13-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.695 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 NetworkManager[54973]: <info>  [1769121213.6971] manager: (tapaef6bd13-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Jan 22 17:33:33 np0005592767 kernel: tapaef6bd13-00: entered promiscuous mode
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.702 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.704 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaef6bd13-00, col_values=(('external_ids', {'iface-id': '285532e0-8bc1-484c-be28-d83cf91bc454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.705 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:33Z|00376|binding|INFO|Releasing lport 285532e0-8bc1-484c-be28-d83cf91bc454 from this chassis (sb_readonly=0)
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.728 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 nova_compute[182623]: 2026-01-22 22:33:33.729 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.730 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aef6bd13-04c4-41b0-9c55-cd87df6d5ff2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aef6bd13-04c4-41b0-9c55-cd87df6d5ff2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.731 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed6a6a0-78ed-4d4c-8ef9-0d85b5169b7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.732 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/aef6bd13-04c4-41b0-9c55-cd87df6d5ff2.pid.haproxy
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID aef6bd13-04c4-41b0-9c55-cd87df6d5ff2
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:33:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:33.735 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'env', 'PROCESS_TAG=haproxy-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aef6bd13-04c4-41b0-9c55-cd87df6d5ff2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.082 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121214.0815918, 444d90a7-b970-474d-8e18-eaab83f057d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.082 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] VM Started (Lifecycle Event)#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.085 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.090 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.096 182627 INFO nova.virt.libvirt.driver [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Instance spawned successfully.#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.097 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.108 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:34 np0005592767 podman[225262]: 2026-01-22 22:33:34.110879829 +0000 UTC m=+0.053642599 container create fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.114 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.127 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.128 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.129 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.129 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.130 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.130 182627 DEBUG nova.virt.libvirt.driver [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:34 np0005592767 systemd[1]: Started libpod-conmon-fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb.scope.
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.166 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.168 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121214.0821452, 444d90a7-b970-474d-8e18-eaab83f057d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.168 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:33:34 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:33:34 np0005592767 podman[225262]: 2026-01-22 22:33:34.081888079 +0000 UTC m=+0.024650879 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:33:34 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/514ac99701f21864ec65bf8f8a7239fbeb9a6f232cc65bb3bf98e0928b5a95e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:33:34 np0005592767 podman[225262]: 2026-01-22 22:33:34.194975178 +0000 UTC m=+0.137737978 container init fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.197 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.201 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121214.0883536, 444d90a7-b970-474d-8e18-eaab83f057d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.202 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:33:34 np0005592767 podman[225262]: 2026-01-22 22:33:34.202830491 +0000 UTC m=+0.145593271 container start fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.224 182627 INFO nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Took 5.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.225 182627 DEBUG nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.228 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:34 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [NOTICE]   (225281) : New worker (225283) forked
Jan 22 17:33:34 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [NOTICE]   (225281) : Loading success.
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.237 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.293 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.354 182627 INFO nova.compute.manager [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Took 6.21 seconds to build instance.#033[00m
Jan 22 17:33:34 np0005592767 nova_compute[182623]: 2026-01-22 22:33:34.372 182627 DEBUG oslo_concurrency.lockutils [None req-ebfd24d8-a6cb-4030-858c-8b80f66a6de2 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:35 np0005592767 nova_compute[182623]: 2026-01-22 22:33:35.774 182627 DEBUG nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:35 np0005592767 nova_compute[182623]: 2026-01-22 22:33:35.775 182627 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:35 np0005592767 nova_compute[182623]: 2026-01-22 22:33:35.776 182627 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:35 np0005592767 nova_compute[182623]: 2026-01-22 22:33:35.776 182627 DEBUG oslo_concurrency.lockutils [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:35 np0005592767 nova_compute[182623]: 2026-01-22 22:33:35.777 182627 DEBUG nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] No waiting events found dispatching network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:35 np0005592767 nova_compute[182623]: 2026-01-22 22:33:35.777 182627 WARNING nova.compute.manager [req-3fee5623-6d49-4328-8252-28972028e125 req-83aa6c16-4d0a-4049-9362-6fd2194d65fb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received unexpected event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec for instance with vm_state active and task_state None.#033[00m
Jan 22 17:33:36 np0005592767 nova_compute[182623]: 2026-01-22 22:33:36.326 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:36 np0005592767 nova_compute[182623]: 2026-01-22 22:33:36.588 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.448 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.449 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.449 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.450 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.450 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.463 182627 INFO nova.compute.manager [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Terminating instance#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.473 182627 DEBUG nova.compute.manager [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:33:40 np0005592767 kernel: tap7958633a-5f (unregistering): left promiscuous mode
Jan 22 17:33:40 np0005592767 NetworkManager[54973]: <info>  [1769121220.4972] device (tap7958633a-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:33:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:40Z|00377|binding|INFO|Releasing lport 7958633a-5f31-4f72-adc5-ea7cdaa817ec from this chassis (sb_readonly=0)
Jan 22 17:33:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:40Z|00378|binding|INFO|Setting lport 7958633a-5f31-4f72-adc5-ea7cdaa817ec down in Southbound
Jan 22 17:33:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:40Z|00379|binding|INFO|Removing iface tap7958633a-5f ovn-installed in OVS
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.538 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.546 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:bc:24 10.100.0.12'], port_security=['fa:16:3e:be:bc:24 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '444d90a7-b970-474d-8e18-eaab83f057d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '070241059b4c45da8be359eb0123c835', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0d68d84a-a893-4fa6-95d3-0f6a74fabbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=37daf1d8-8950-4d32-9417-c12eb73acd75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=7958633a-5f31-4f72-adc5-ea7cdaa817ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.548 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 7958633a-5f31-4f72-adc5-ea7cdaa817ec in datapath aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 unbound from our chassis#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.550 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.552 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.552 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[29bd69ed-a588-45c1-80d7-e61e13f610d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.554 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 namespace which is not needed anymore#033[00m
Jan 22 17:33:40 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:33:40 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:33:40 np0005592767 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000064.scope: Deactivated successfully.
Jan 22 17:33:40 np0005592767 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000064.scope: Consumed 7.234s CPU time.
Jan 22 17:33:40 np0005592767 systemd-machined[153912]: Machine qemu-48-instance-00000064 terminated.
Jan 22 17:33:40 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [NOTICE]   (225281) : haproxy version is 2.8.14-c23fe91
Jan 22 17:33:40 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [NOTICE]   (225281) : path to executable is /usr/sbin/haproxy
Jan 22 17:33:40 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [WARNING]  (225281) : Exiting Master process...
Jan 22 17:33:40 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [ALERT]    (225281) : Current worker (225283) exited with code 143 (Terminated)
Jan 22 17:33:40 np0005592767 neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2[225277]: [WARNING]  (225281) : All workers exited. Exiting... (0)
Jan 22 17:33:40 np0005592767 systemd[1]: libpod-fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb.scope: Deactivated successfully.
Jan 22 17:33:40 np0005592767 conmon[225277]: conmon fd6ef9059e74be6fefde <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb.scope/container/memory.events
Jan 22 17:33:40 np0005592767 podman[225318]: 2026-01-22 22:33:40.701770061 +0000 UTC m=+0.051074596 container died fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.703 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.708 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb-userdata-shm.mount: Deactivated successfully.
Jan 22 17:33:40 np0005592767 systemd[1]: var-lib-containers-storage-overlay-514ac99701f21864ec65bf8f8a7239fbeb9a6f232cc65bb3bf98e0928b5a95e5-merged.mount: Deactivated successfully.
Jan 22 17:33:40 np0005592767 podman[225318]: 2026-01-22 22:33:40.739345994 +0000 UTC m=+0.088650529 container cleanup fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.745 182627 INFO nova.virt.libvirt.driver [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Instance destroyed successfully.#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.746 182627 DEBUG nova.objects.instance [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lazy-loading 'resources' on Instance uuid 444d90a7-b970-474d-8e18-eaab83f057d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:40 np0005592767 systemd[1]: libpod-conmon-fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb.scope: Deactivated successfully.
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.774 182627 DEBUG nova.virt.libvirt.vif [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-956648261',display_name='tempest-ServerMetadataTestJSON-server-956648261',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-956648261',id=100,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:33:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='070241059b4c45da8be359eb0123c835',ramdisk_id='',reservation_id='r-0wi96l4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vi
rtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-894066612',owner_user_name='tempest-ServerMetadataTestJSON-894066612-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:33:40Z,user_data=None,user_id='0d6cbf2c31d34db0a5b2c5465f83dd85',uuid=444d90a7-b970-474d-8e18-eaab83f057d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.775 182627 DEBUG nova.network.os_vif_util [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Converting VIF {"id": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "address": "fa:16:3e:be:bc:24", "network": {"id": "aef6bd13-04c4-41b0-9c55-cd87df6d5ff2", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1186394645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "070241059b4c45da8be359eb0123c835", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7958633a-5f", "ovs_interfaceid": "7958633a-5f31-4f72-adc5-ea7cdaa817ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.776 182627 DEBUG nova.network.os_vif_util [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.776 182627 DEBUG os_vif [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.778 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.778 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7958633a-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.781 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.785 182627 INFO os_vif [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:bc:24,bridge_name='br-int',has_traffic_filtering=True,id=7958633a-5f31-4f72-adc5-ea7cdaa817ec,network=Network(aef6bd13-04c4-41b0-9c55-cd87df6d5ff2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7958633a-5f')#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.785 182627 INFO nova.virt.libvirt.driver [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Deleting instance files /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5_del#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.786 182627 INFO nova.virt.libvirt.driver [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Deletion of /var/lib/nova/instances/444d90a7-b970-474d-8e18-eaab83f057d5_del complete#033[00m
Jan 22 17:33:40 np0005592767 podman[225365]: 2026-01-22 22:33:40.810916259 +0000 UTC m=+0.045232561 container remove fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.819 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7aada6c2-7cb1-49bf-988a-7b6ab4d115a4]: (4, ('Thu Jan 22 10:33:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 (fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb)\nfd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb\nThu Jan 22 10:33:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 (fd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb)\nfd6ef9059e74be6fefde351a1fa74dab113570836df2a15eb65a5f84ac606bcb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.820 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fd311014-7708-4a1d-ab3d-fbb6cb25e96d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.821 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaef6bd13-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.823 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 kernel: tapaef6bd13-00: left promiscuous mode
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.835 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.837 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[685b1a36-03f6-4819-9a00-7dbe38aa3cea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.857 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7de6b057-1471-455f-9fab-2e47c2255859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.858 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8cc5a0-835e-4eaa-926a-66212490a29b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.868 182627 DEBUG nova.compute.manager [req-872edcca-6f82-4e0e-b505-7e67135abaeb req-93b9aab9-8f5d-452d-a41f-ae7ca083a958 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-vif-unplugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.869 182627 DEBUG oslo_concurrency.lockutils [req-872edcca-6f82-4e0e-b505-7e67135abaeb req-93b9aab9-8f5d-452d-a41f-ae7ca083a958 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.869 182627 DEBUG oslo_concurrency.lockutils [req-872edcca-6f82-4e0e-b505-7e67135abaeb req-93b9aab9-8f5d-452d-a41f-ae7ca083a958 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.870 182627 DEBUG oslo_concurrency.lockutils [req-872edcca-6f82-4e0e-b505-7e67135abaeb req-93b9aab9-8f5d-452d-a41f-ae7ca083a958 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.870 182627 DEBUG nova.compute.manager [req-872edcca-6f82-4e0e-b505-7e67135abaeb req-93b9aab9-8f5d-452d-a41f-ae7ca083a958 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] No waiting events found dispatching network-vif-unplugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.870 182627 DEBUG nova.compute.manager [req-872edcca-6f82-4e0e-b505-7e67135abaeb req-93b9aab9-8f5d-452d-a41f-ae7ca083a958 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-vif-unplugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.880 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3a6413-622b-48f1-b597-c363cb578be3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480993, 'reachable_time': 29485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225383, 'error': None, 'target': 'ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.883 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aef6bd13-04c4-41b0-9c55-cd87df6d5ff2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:33:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:40.883 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[37006bbe-ab3f-43ea-9f38-320fb47fa183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:33:40 np0005592767 systemd[1]: run-netns-ovnmeta\x2daef6bd13\x2d04c4\x2d41b0\x2d9c55\x2dcd87df6d5ff2.mount: Deactivated successfully.
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.896 182627 INFO nova.compute.manager [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.897 182627 DEBUG oslo.service.loopingcall [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.897 182627 DEBUG nova.compute.manager [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 17:33:40 np0005592767 nova_compute[182623]: 2026-01-22 22:33:40.898 182627 DEBUG nova.network.neutron [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.102 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.103 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.120 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.240 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.241 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.254 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.254 182627 INFO nova.compute.claims [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.329 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.418 182627 DEBUG nova.compute.provider_tree [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.433 182627 DEBUG nova.scheduler.client.report [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.465 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.465 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.522 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.522 182627 DEBUG nova.network.neutron [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.538 182627 INFO nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.579 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.617 182627 DEBUG nova.network.neutron [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.640 182627 INFO nova.compute.manager [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Took 0.74 seconds to deallocate network for instance.
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.712 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.714 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.715 182627 INFO nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Creating image(s)
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.716 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "/var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.717 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "/var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.719 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "/var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.751 182627 DEBUG nova.policy [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.754 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.755 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.756 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.849 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.852 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.853 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.880 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.913 182627 DEBUG nova.compute.provider_tree [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.952 182627 DEBUG nova.scheduler.client.report [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.971 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.972 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:41 np0005592767 nova_compute[182623]: 2026-01-22 22:33:41.993 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.009 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.009 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.009 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.026 182627 INFO nova.scheduler.client.report [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Deleted allocations for instance 444d90a7-b970-474d-8e18-eaab83f057d5
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.062 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.063 182627 DEBUG nova.virt.disk.api [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Checking if we can resize image /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.063 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.108 182627 DEBUG oslo_concurrency.lockutils [None req-681ad2c1-cf44-4aa8-b11e-2345405b788b 0d6cbf2c31d34db0a5b2c5465f83dd85 070241059b4c45da8be359eb0123c835 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.135 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.135 182627 DEBUG nova.virt.disk.api [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Cannot resize image /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.136 182627 DEBUG nova.objects.instance [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2cf209bf-cc1d-4f9f-953d-c73d3a446160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.149 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.150 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Ensure instance console log exists: /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.150 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.151 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.151 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:42 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.657 182627 DEBUG nova.network.neutron [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Successfully created port: b4123d25-3dda-4aac-84ab-b88e969ec9a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:42.999 182627 DEBUG nova.compute.manager [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.000 182627 DEBUG oslo_concurrency.lockutils [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.001 182627 DEBUG oslo_concurrency.lockutils [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.001 182627 DEBUG oslo_concurrency.lockutils [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "444d90a7-b970-474d-8e18-eaab83f057d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.002 182627 DEBUG nova.compute.manager [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] No waiting events found dispatching network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.003 182627 WARNING nova.compute.manager [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received unexpected event network-vif-plugged-7958633a-5f31-4f72-adc5-ea7cdaa817ec for instance with vm_state deleted and task_state None.
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.003 182627 DEBUG nova.compute.manager [req-f8e1e027-5143-46f5-8487-6d69e4a838fd req-3ded9651-e264-443a-a7c1-5e4ff145795f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Received event network-vif-deleted-7958633a-5f31-4f72-adc5-ea7cdaa817ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.218 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.218 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.235 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.325 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.326 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.334 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.334 182627 INFO nova.compute.claims [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.479 182627 DEBUG nova.compute.provider_tree [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.493 182627 DEBUG nova.scheduler.client.report [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.516 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.517 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.569 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.570 182627 DEBUG nova.network.neutron [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.594 182627 INFO nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.600 182627 DEBUG nova.network.neutron [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Successfully updated port: b4123d25-3dda-4aac-84ab-b88e969ec9a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.628 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.633 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.634 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquired lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.634 182627 DEBUG nova.network.neutron [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.707 182627 DEBUG nova.compute.manager [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-changed-b4123d25-3dda-4aac-84ab-b88e969ec9a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.708 182627 DEBUG nova.compute.manager [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Refreshing instance network info cache due to event network-changed-b4123d25-3dda-4aac-84ab-b88e969ec9a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.708 182627 DEBUG oslo_concurrency.lockutils [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.773 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.774 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.775 182627 INFO nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Creating image(s)#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.775 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.776 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.776 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.789 182627 DEBUG nova.policy [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.792 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.847 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.848 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.848 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.859 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.934 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.935 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.973 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.974 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:43 np0005592767 nova_compute[182623]: 2026-01-22 22:33:43.974 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.037 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.039 182627 DEBUG nova.virt.disk.api [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Checking if we can resize image /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.039 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.098 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.099 182627 DEBUG nova.virt.disk.api [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Cannot resize image /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.100 182627 DEBUG nova.objects.instance [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'migration_context' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.117 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.118 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Ensure instance console log exists: /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.118 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.118 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.119 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:44 np0005592767 nova_compute[182623]: 2026-01-22 22:33:44.167 182627 DEBUG nova.network.neutron [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:33:44 np0005592767 podman[225412]: 2026-01-22 22:33:44.211398077 +0000 UTC m=+0.118448341 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.156 182627 DEBUG nova.network.neutron [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Successfully created port: 97b31fc4-0356-4ece-b72f-7e18f5cc4782 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.405 182627 DEBUG nova.network.neutron [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Updating instance_info_cache with network_info: [{"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.448 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Releasing lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.449 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Instance network_info: |[{"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.450 182627 DEBUG oslo_concurrency.lockutils [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.451 182627 DEBUG nova.network.neutron [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Refreshing network info cache for port b4123d25-3dda-4aac-84ab-b88e969ec9a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.456 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Start _get_guest_xml network_info=[{"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.462 182627 WARNING nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.466 182627 DEBUG nova.virt.libvirt.host [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.468 182627 DEBUG nova.virt.libvirt.host [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.471 182627 DEBUG nova.virt.libvirt.host [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.472 182627 DEBUG nova.virt.libvirt.host [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.474 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.474 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.475 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.476 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.476 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.477 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.477 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.478 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.478 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.479 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.479 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.480 182627 DEBUG nova.virt.hardware [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.486 182627 DEBUG nova.virt.libvirt.vif [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2065365074',display_name='tempest-ServerRescueNegativeTestJSON-server-2065365074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-2065365074',id=102,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-kow01kp3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_user_name=
'tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:41Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=2cf209bf-cc1d-4f9f-953d-c73d3a446160,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.487 182627 DEBUG nova.network.os_vif_util [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.489 182627 DEBUG nova.network.os_vif_util [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.491 182627 DEBUG nova.objects.instance [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cf209bf-cc1d-4f9f-953d-c73d3a446160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.512 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <uuid>2cf209bf-cc1d-4f9f-953d-c73d3a446160</uuid>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <name>instance-00000066</name>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-2065365074</nova:name>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:33:45</nova:creationTime>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:user uuid="ee8ea21df7c94181896a0576e091d6bf">tempest-ServerRescueNegativeTestJSON-554166911-project-member</nova:user>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:project uuid="793af464aeec424ead871e11355f94e3">tempest-ServerRescueNegativeTestJSON-554166911</nova:project>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        <nova:port uuid="b4123d25-3dda-4aac-84ab-b88e969ec9a7">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <entry name="serial">2cf209bf-cc1d-4f9f-953d-c73d3a446160</entry>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <entry name="uuid">2cf209bf-cc1d-4f9f-953d-c73d3a446160</entry>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.config"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:45:8b:7a"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <target dev="tapb4123d25-3d"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/console.log" append="off"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:33:45 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:33:45 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:33:45 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:33:45 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.515 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Preparing to wait for external event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.516 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.516 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.516 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.517 182627 DEBUG nova.virt.libvirt.vif [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2065365074',display_name='tempest-ServerRescueNegativeTestJSON-server-2065365074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-2065365074',id=102,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-kow01kp3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_
user_name='tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:41Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=2cf209bf-cc1d-4f9f-953d-c73d3a446160,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.518 182627 DEBUG nova.network.os_vif_util [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.519 182627 DEBUG nova.network.os_vif_util [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.519 182627 DEBUG os_vif [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.520 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.521 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.522 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.526 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.526 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb4123d25-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.527 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb4123d25-3d, col_values=(('external_ids', {'iface-id': 'b4123d25-3dda-4aac-84ab-b88e969ec9a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:8b:7a', 'vm-uuid': '2cf209bf-cc1d-4f9f-953d-c73d3a446160'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.529 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:45 np0005592767 NetworkManager[54973]: <info>  [1769121225.5305] manager: (tapb4123d25-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.532 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.539 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.541 182627 INFO os_vif [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d')#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.606 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.607 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.607 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No VIF found with MAC fa:16:3e:45:8b:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:33:45 np0005592767 nova_compute[182623]: 2026-01-22 22:33:45.608 182627 INFO nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Using config drive#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.164 182627 INFO nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Creating config drive at /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.config#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.169 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpver9gh7d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.313 182627 DEBUG oslo_concurrency.processutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpver9gh7d" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.330 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 kernel: tapb4123d25-3d: entered promiscuous mode
Jan 22 17:33:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:46Z|00380|binding|INFO|Claiming lport b4123d25-3dda-4aac-84ab-b88e969ec9a7 for this chassis.
Jan 22 17:33:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:46Z|00381|binding|INFO|b4123d25-3dda-4aac-84ab-b88e969ec9a7: Claiming fa:16:3e:45:8b:7a 10.100.0.11
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.399 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 NetworkManager[54973]: <info>  [1769121226.4005] manager: (tapb4123d25-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.407 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.418 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:8b:7a 10.100.0.11'], port_security=['fa:16:3e:45:8b:7a 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e48ef497-1516-42f4-a190-f681841535fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793af464aeec424ead871e11355f94e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8c5026e2-04ae-4fa8-b97c-9c07f7527d9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a172c89f-e8b1-4850-8807-9110ecdce271, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b4123d25-3dda-4aac-84ab-b88e969ec9a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.419 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b4123d25-3dda-4aac-84ab-b88e969ec9a7 in datapath e48ef497-1516-42f4-a190-f681841535fb bound to our chassis#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.421 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e48ef497-1516-42f4-a190-f681841535fb#033[00m
Jan 22 17:33:46 np0005592767 systemd-udevd[225458]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.435 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6a46a7c3-c76b-4c5a-8186-cee1be2f5c8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.436 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape48ef497-11 in ovnmeta-e48ef497-1516-42f4-a190-f681841535fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.438 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape48ef497-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.438 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d6662e30-1242-4382-923b-a25fc36405d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.439 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[54c5412d-8e25-4ea5-9d39-3ce713b406f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 systemd-machined[153912]: New machine qemu-49-instance-00000066.
Jan 22 17:33:46 np0005592767 NetworkManager[54973]: <info>  [1769121226.4484] device (tapb4123d25-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:33:46 np0005592767 NetworkManager[54973]: <info>  [1769121226.4490] device (tapb4123d25-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.452 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d94d72b9-aae5-47fe-8b48-874ea8765d07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 systemd[1]: Started Virtual Machine qemu-49-instance-00000066.
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.489 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b90f44e9-6732-4eac-97e3-3e14482bed4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.492 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:46Z|00382|binding|INFO|Setting lport b4123d25-3dda-4aac-84ab-b88e969ec9a7 ovn-installed in OVS
Jan 22 17:33:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:46Z|00383|binding|INFO|Setting lport b4123d25-3dda-4aac-84ab-b88e969ec9a7 up in Southbound
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.496 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.522 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[00b53779-d04d-4106-9788-ebace4b8ab37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.527 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b800e05c-1067-4d33-93d7-8c3d593b0ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 NetworkManager[54973]: <info>  [1769121226.5294] manager: (tape48ef497-10): new Veth device (/org/freedesktop/NetworkManager/Devices/184)
Jan 22 17:33:46 np0005592767 systemd-udevd[225462]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.562 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef84bef-918d-40b7-9e6b-1f6826508f98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.567 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[698614d5-7e72-48c4-8cfe-8249d6c7f103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 NetworkManager[54973]: <info>  [1769121226.5934] device (tape48ef497-10): carrier: link connected
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.602 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2374fca8-69d9-48a3-9c8e-eed37253ca5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.627 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eb728c3b-f4c0-47e6-b885-6223af8b7d5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape48ef497-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:e0:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482317, 'reachable_time': 40042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225491, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.643 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fc868bcc-a004-466e-a498-8fcfef486d5c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:e060'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482317, 'tstamp': 482317}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225497, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.667 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1e7ce9-f9f3-4c17-82bf-2ff016bbb585]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape48ef497-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:e0:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482317, 'reachable_time': 40042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225499, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.705 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[86b63709-3dc2-4a5d-b36e-934cd7c7a873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.716 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121226.7157454, 2cf209bf-cc1d-4f9f-953d-c73d3a446160 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.716 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] VM Started (Lifecycle Event)#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.740 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.743 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121226.7189956, 2cf209bf-cc1d-4f9f-953d-c73d3a446160 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.743 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.749 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.774 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb2aa46-133b-4150-92c8-8d9f35f129ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.776 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape48ef497-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.776 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.776 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape48ef497-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.778 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.779 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 NetworkManager[54973]: <info>  [1769121226.7793] manager: (tape48ef497-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 22 17:33:46 np0005592767 kernel: tape48ef497-10: entered promiscuous mode
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.782 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape48ef497-10, col_values=(('external_ids', {'iface-id': 'a8d1d971-761f-423e-815b-1f3891a52bf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.784 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:46Z|00384|binding|INFO|Releasing lport a8d1d971-761f-423e-815b-1f3891a52bf3 from this chassis (sb_readonly=0)
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.785 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e48ef497-1516-42f4-a190-f681841535fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e48ef497-1516-42f4-a190-f681841535fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.786 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e141e45f-ee9e-4a96-82d6-b7d041edf797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.787 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-e48ef497-1516-42f4-a190-f681841535fb
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/e48ef497-1516-42f4-a190-f681841535fb.pid.haproxy
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID e48ef497-1516-42f4-a190-f681841535fb
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.787 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:46.787 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'env', 'PROCESS_TAG=haproxy-e48ef497-1516-42f4-a190-f681841535fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e48ef497-1516-42f4-a190-f681841535fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.795 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:46 np0005592767 nova_compute[182623]: 2026-01-22 22:33:46.806 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.003 182627 DEBUG nova.network.neutron [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Successfully updated port: 97b31fc4-0356-4ece-b72f-7e18f5cc4782 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.020 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.020 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquired lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.020 182627 DEBUG nova.network.neutron [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.105 182627 DEBUG nova.compute.manager [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-changed-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.105 182627 DEBUG nova.compute.manager [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Refreshing instance network info cache due to event network-changed-97b31fc4-0356-4ece-b72f-7e18f5cc4782. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.105 182627 DEBUG oslo_concurrency.lockutils [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:47 np0005592767 podman[225532]: 2026-01-22 22:33:47.164920329 +0000 UTC m=+0.051759075 container create bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.181 182627 DEBUG nova.network.neutron [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Updated VIF entry in instance network info cache for port b4123d25-3dda-4aac-84ab-b88e969ec9a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.182 182627 DEBUG nova.network.neutron [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Updating instance_info_cache with network_info: [{"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.195 182627 DEBUG oslo_concurrency.lockutils [req-77d0d851-2b11-439f-9e3c-f92bf07cc17e req-e6b6e5a6-9698-439e-acb9-8dfab09f8c14 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:47 np0005592767 systemd[1]: Started libpod-conmon-bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a.scope.
Jan 22 17:33:47 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:33:47 np0005592767 podman[225532]: 2026-01-22 22:33:47.138742189 +0000 UTC m=+0.025580955 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:33:47 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a2a5937b92ddd2ea8692275e76ac1ea0c1179de5d55bc70c5fd40062444fc92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:33:47 np0005592767 podman[225532]: 2026-01-22 22:33:47.249921035 +0000 UTC m=+0.136759801 container init bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:33:47 np0005592767 podman[225532]: 2026-01-22 22:33:47.255353218 +0000 UTC m=+0.142191954 container start bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:33:47 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [NOTICE]   (225552) : New worker (225554) forked
Jan 22 17:33:47 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [NOTICE]   (225552) : Loading success.
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.298 182627 DEBUG nova.compute.manager [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.298 182627 DEBUG oslo_concurrency.lockutils [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.299 182627 DEBUG oslo_concurrency.lockutils [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.299 182627 DEBUG oslo_concurrency.lockutils [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.300 182627 DEBUG nova.compute.manager [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Processing event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.300 182627 DEBUG nova.compute.manager [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.301 182627 DEBUG oslo_concurrency.lockutils [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.301 182627 DEBUG oslo_concurrency.lockutils [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.301 182627 DEBUG oslo_concurrency.lockutils [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.302 182627 DEBUG nova.compute.manager [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] No waiting events found dispatching network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.302 182627 WARNING nova.compute.manager [req-cb52c3ae-3ea3-4665-a273-418f25056926 req-c8509be9-77ff-4403-98b6-b4aa7faa2fbf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received unexpected event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.303 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.307 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121227.3072262, 2cf209bf-cc1d-4f9f-953d-c73d3a446160 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.307 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.309 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.312 182627 INFO nova.virt.libvirt.driver [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Instance spawned successfully.#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.312 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.334 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.340 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.342 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.342 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.343 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.343 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.343 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.344 182627 DEBUG nova.virt.libvirt.driver [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.385 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.421 182627 INFO nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Took 5.71 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.422 182627 DEBUG nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.486 182627 INFO nova.compute.manager [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Took 6.30 seconds to build instance.#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.520 182627 DEBUG oslo_concurrency.lockutils [None req-e1161278-e7e8-45d8-a612-4f583283f19d ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:47 np0005592767 nova_compute[182623]: 2026-01-22 22:33:47.593 182627 DEBUG nova.network.neutron [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.962 182627 DEBUG nova.network.neutron [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Updating instance_info_cache with network_info: [{"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.985 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Releasing lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.985 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance network_info: |[{"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.986 182627 DEBUG oslo_concurrency.lockutils [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.987 182627 DEBUG nova.network.neutron [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Refreshing network info cache for port 97b31fc4-0356-4ece-b72f-7e18f5cc4782 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.990 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Start _get_guest_xml network_info=[{"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:33:48 np0005592767 nova_compute[182623]: 2026-01-22 22:33:48.994 182627 WARNING nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.000 182627 DEBUG nova.virt.libvirt.host [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.001 182627 DEBUG nova.virt.libvirt.host [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.008 182627 DEBUG nova.virt.libvirt.host [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.009 182627 DEBUG nova.virt.libvirt.host [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.010 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.010 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.011 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.011 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.012 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.012 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.013 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.013 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.013 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.014 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.014 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.014 182627 DEBUG nova.virt.hardware [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.018 182627 DEBUG nova.virt.libvirt.vif [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1215017684',display_name='tempest-ServerRescueNegativeTestJSON-server-1215017684',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1215017684',id=103,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-ygqi6u2u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_user_name=
'tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:43Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=a7098cc6-e436-4068-b543-d5fc63321c86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.019 182627 DEBUG nova.network.os_vif_util [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.020 182627 DEBUG nova.network.os_vif_util [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.021 182627 DEBUG nova.objects.instance [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.035 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <uuid>a7098cc6-e436-4068-b543-d5fc63321c86</uuid>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <name>instance-00000067</name>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1215017684</nova:name>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:33:48</nova:creationTime>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:user uuid="ee8ea21df7c94181896a0576e091d6bf">tempest-ServerRescueNegativeTestJSON-554166911-project-member</nova:user>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:project uuid="793af464aeec424ead871e11355f94e3">tempest-ServerRescueNegativeTestJSON-554166911</nova:project>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        <nova:port uuid="97b31fc4-0356-4ece-b72f-7e18f5cc4782">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <entry name="serial">a7098cc6-e436-4068-b543-d5fc63321c86</entry>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <entry name="uuid">a7098cc6-e436-4068-b543-d5fc63321c86</entry>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b5:a8:6c"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <target dev="tap97b31fc4-03"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/console.log" append="off"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:33:49 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:33:49 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:33:49 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:33:49 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.042 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Preparing to wait for external event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.043 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.043 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.043 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.044 182627 DEBUG nova.virt.libvirt.vif [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:33:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1215017684',display_name='tempest-ServerRescueNegativeTestJSON-server-1215017684',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1215017684',id=103,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-ygqi6u2u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_
user_name='tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:43Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=a7098cc6-e436-4068-b543-d5fc63321c86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.044 182627 DEBUG nova.network.os_vif_util [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.045 182627 DEBUG nova.network.os_vif_util [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.045 182627 DEBUG os_vif [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.047 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.048 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.052 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap97b31fc4-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.053 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap97b31fc4-03, col_values=(('external_ids', {'iface-id': '97b31fc4-0356-4ece-b72f-7e18f5cc4782', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:a8:6c', 'vm-uuid': 'a7098cc6-e436-4068-b543-d5fc63321c86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:49 np0005592767 NetworkManager[54973]: <info>  [1769121229.0562] manager: (tap97b31fc4-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.062 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.063 182627 INFO os_vif [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03')#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.122 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.122 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.123 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No VIF found with MAC fa:16:3e:b5:a8:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.124 182627 INFO nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Using config drive#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.840 182627 INFO nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Creating config drive at /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.847 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7wvuq7v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:33:49 np0005592767 nova_compute[182623]: 2026-01-22 22:33:49.972 182627 DEBUG oslo_concurrency.processutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7wvuq7v" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:33:50 np0005592767 kernel: tap97b31fc4-03: entered promiscuous mode
Jan 22 17:33:50 np0005592767 NetworkManager[54973]: <info>  [1769121230.0329] manager: (tap97b31fc4-03): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Jan 22 17:33:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:50Z|00385|binding|INFO|Claiming lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 for this chassis.
Jan 22 17:33:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:50Z|00386|binding|INFO|97b31fc4-0356-4ece-b72f-7e18f5cc4782: Claiming fa:16:3e:b5:a8:6c 10.100.0.7
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.035 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.043 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:a8:6c 10.100.0.7'], port_security=['fa:16:3e:b5:a8:6c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e48ef497-1516-42f4-a190-f681841535fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793af464aeec424ead871e11355f94e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8c5026e2-04ae-4fa8-b97c-9c07f7527d9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a172c89f-e8b1-4850-8807-9110ecdce271, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=97b31fc4-0356-4ece-b72f-7e18f5cc4782) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.044 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 97b31fc4-0356-4ece-b72f-7e18f5cc4782 in datapath e48ef497-1516-42f4-a190-f681841535fb bound to our chassis#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.046 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e48ef497-1516-42f4-a190-f681841535fb#033[00m
Jan 22 17:33:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:50Z|00387|binding|INFO|Setting lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 ovn-installed in OVS
Jan 22 17:33:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:50Z|00388|binding|INFO|Setting lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 up in Southbound
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:50 np0005592767 systemd-udevd[225584]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.073 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[23e0ef6d-ade6-4e8e-8eaf-dd743e07d8cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:50 np0005592767 systemd-machined[153912]: New machine qemu-50-instance-00000067.
Jan 22 17:33:50 np0005592767 NetworkManager[54973]: <info>  [1769121230.0837] device (tap97b31fc4-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:33:50 np0005592767 NetworkManager[54973]: <info>  [1769121230.0843] device (tap97b31fc4-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:33:50 np0005592767 systemd[1]: Started Virtual Machine qemu-50-instance-00000067.
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.112 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[39681bc0-cc33-4219-903d-88098d7cb195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.115 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8b289dd4-782c-4e0b-9641-929bf1e0ef46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.139 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[60f3421b-d36c-4c6d-9f1f-c2cb6fb5fc4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.155 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3ecea076-1b76-49b9-aabf-89aad31cdec5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape48ef497-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:e0:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482317, 'reachable_time': 40042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225596, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.170 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[601c54d8-2aee-4015-a18a-7547b60cc6ef]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482331, 'tstamp': 482331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225597, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482334, 'tstamp': 482334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225597, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.172 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape48ef497-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.173 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.174 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.175 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape48ef497-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.175 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.175 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape48ef497-10, col_values=(('external_ids', {'iface-id': 'a8d1d971-761f-423e-815b-1f3891a52bf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:33:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:33:50.175 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.312 182627 DEBUG nova.compute.manager [req-57ace0aa-22f4-466d-aeca-ea79dff43f48 req-f0d68058-9e3c-4ea5-9133-5b0d1e08d512 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.313 182627 DEBUG oslo_concurrency.lockutils [req-57ace0aa-22f4-466d-aeca-ea79dff43f48 req-f0d68058-9e3c-4ea5-9133-5b0d1e08d512 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.314 182627 DEBUG oslo_concurrency.lockutils [req-57ace0aa-22f4-466d-aeca-ea79dff43f48 req-f0d68058-9e3c-4ea5-9133-5b0d1e08d512 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.314 182627 DEBUG oslo_concurrency.lockutils [req-57ace0aa-22f4-466d-aeca-ea79dff43f48 req-f0d68058-9e3c-4ea5-9133-5b0d1e08d512 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.314 182627 DEBUG nova.compute.manager [req-57ace0aa-22f4-466d-aeca-ea79dff43f48 req-f0d68058-9e3c-4ea5-9133-5b0d1e08d512 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Processing event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.554 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121230.5529757, a7098cc6-e436-4068-b543-d5fc63321c86 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.555 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] VM Started (Lifecycle Event)#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.559 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.565 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.570 182627 INFO nova.virt.libvirt.driver [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance spawned successfully.#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.571 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.588 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.594 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.608 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.609 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.610 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.611 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.612 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.613 182627 DEBUG nova.virt.libvirt.driver [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.620 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.621 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121230.5532355, a7098cc6-e436-4068-b543-d5fc63321c86 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.622 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.668 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.673 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121230.563364, a7098cc6-e436-4068-b543-d5fc63321c86 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.674 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.696 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.700 182627 INFO nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Took 6.93 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.701 182627 DEBUG nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.706 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.734 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.799 182627 INFO nova.compute.manager [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Took 7.50 seconds to build instance.#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.829 182627 DEBUG oslo_concurrency.lockutils [None req-2da15d94-0000-4d80-b21c-355ab855c952 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.868 182627 DEBUG nova.network.neutron [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Updated VIF entry in instance network info cache for port 97b31fc4-0356-4ece-b72f-7e18f5cc4782. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.869 182627 DEBUG nova.network.neutron [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Updating instance_info_cache with network_info: [{"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:50 np0005592767 nova_compute[182623]: 2026-01-22 22:33:50.885 182627 DEBUG oslo_concurrency.lockutils [req-1149cfea-effa-407d-8525-c2f3fd063d82 req-f5b903a9-db40-447d-b9a2-0acc474ae090 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:51 np0005592767 podman[225607]: 2026-01-22 22:33:51.201735823 +0000 UTC m=+0.116414095 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64)
Jan 22 17:33:51 np0005592767 podman[225606]: 2026-01-22 22:33:51.2168454 +0000 UTC m=+0.132775078 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:33:51 np0005592767 nova_compute[182623]: 2026-01-22 22:33:51.332 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.106 182627 INFO nova.compute.manager [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Rescuing#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.107 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.108 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquired lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.108 182627 DEBUG nova.network.neutron [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.434 182627 DEBUG nova.compute.manager [req-d9eb4b94-58aa-4a5c-b0fc-8782a1ca46b7 req-3d62fe72-9fff-42d9-9bf5-e6423ba21c46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.435 182627 DEBUG oslo_concurrency.lockutils [req-d9eb4b94-58aa-4a5c-b0fc-8782a1ca46b7 req-3d62fe72-9fff-42d9-9bf5-e6423ba21c46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.436 182627 DEBUG oslo_concurrency.lockutils [req-d9eb4b94-58aa-4a5c-b0fc-8782a1ca46b7 req-3d62fe72-9fff-42d9-9bf5-e6423ba21c46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.436 182627 DEBUG oslo_concurrency.lockutils [req-d9eb4b94-58aa-4a5c-b0fc-8782a1ca46b7 req-3d62fe72-9fff-42d9-9bf5-e6423ba21c46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.436 182627 DEBUG nova.compute.manager [req-d9eb4b94-58aa-4a5c-b0fc-8782a1ca46b7 req-3d62fe72-9fff-42d9-9bf5-e6423ba21c46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:33:52 np0005592767 nova_compute[182623]: 2026-01-22 22:33:52.437 182627 WARNING nova.compute.manager [req-d9eb4b94-58aa-4a5c-b0fc-8782a1ca46b7 req-3d62fe72-9fff-42d9-9bf5-e6423ba21c46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received unexpected event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:33:54 np0005592767 nova_compute[182623]: 2026-01-22 22:33:54.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:54 np0005592767 nova_compute[182623]: 2026-01-22 22:33:54.670 182627 DEBUG nova.network.neutron [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Updating instance_info_cache with network_info: [{"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:33:54 np0005592767 nova_compute[182623]: 2026-01-22 22:33:54.691 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Releasing lock "refresh_cache-a7098cc6-e436-4068-b543-d5fc63321c86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:33:55 np0005592767 nova_compute[182623]: 2026-01-22 22:33:55.130 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:33:55 np0005592767 nova_compute[182623]: 2026-01-22 22:33:55.743 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121220.7432508, 444d90a7-b970-474d-8e18-eaab83f057d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:33:55 np0005592767 nova_compute[182623]: 2026-01-22 22:33:55.744 182627 INFO nova.compute.manager [-] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:33:55 np0005592767 nova_compute[182623]: 2026-01-22 22:33:55.760 182627 DEBUG nova.compute.manager [None req-118713f8-0148-4295-a88a-648228e5583c - - - - - -] [instance: 444d90a7-b970-474d-8e18-eaab83f057d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:33:56 np0005592767 podman[225647]: 2026-01-22 22:33:56.172997427 +0000 UTC m=+0.074196650 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 17:33:56 np0005592767 podman[225648]: 2026-01-22 22:33:56.177158905 +0000 UTC m=+0.078371999 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:33:56 np0005592767 nova_compute[182623]: 2026-01-22 22:33:56.335 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:59 np0005592767 nova_compute[182623]: 2026-01-22 22:33:59.061 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:33:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:59Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:8b:7a 10.100.0.11
Jan 22 17:33:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:33:59Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:8b:7a 10.100.0.11
Jan 22 17:34:01 np0005592767 nova_compute[182623]: 2026-01-22 22:34:01.338 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:03Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b5:a8:6c 10.100.0.7
Jan 22 17:34:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:03Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b5:a8:6c 10.100.0.7
Jan 22 17:34:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:04.000 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:34:04 np0005592767 nova_compute[182623]: 2026-01-22 22:34:04.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:04.001 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:34:04 np0005592767 nova_compute[182623]: 2026-01-22 22:34:04.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:04 np0005592767 podman[225714]: 2026-01-22 22:34:04.140860601 +0000 UTC m=+0.059637099 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:34:05 np0005592767 nova_compute[182623]: 2026-01-22 22:34:05.173 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:34:06 np0005592767 nova_compute[182623]: 2026-01-22 22:34:06.340 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.325 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000066', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '793af464aeec424ead871e11355f94e3', 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'hostId': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.330 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000067', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '793af464aeec424ead871e11355f94e3', 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'hostId': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.331 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.332 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>]
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.348 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.350 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.366 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.367 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a10896ce-646a-4916-bf34-50a47283b19a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.333181', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76a244d0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.968136482, 'message_signature': '6bb3df44729c3e1e75c63c6a7ba501eb37932367bb4a4f378a48199dfee18c17'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.333181', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76a275f4-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.968136482, 'message_signature': '4f0ddd8f0e83b03b0323cf090441a0fd0ffc14095c75ae752ffc968b183e578e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'a7098cc6-e436-4068-b543-d5fc63321c86-vda', 'timestamp': '2026-01-22T22:34:07.333181', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'instance-00000067', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76a4db78-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.986463101, 'message_signature': 'b43ceab2dcd3845c4211afab92ac214189766c1f32cbae372e244b242f9edfdc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'a7098cc6-e436-4068-b543-d5fc63321c86-sda', 'timestamp': '2026-01-22T22:34:07.333181', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'instance-00000067', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76a4f4b4-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.986463101, 'message_signature': '5939d86196d456e70ee50201f8d3ef347f1eb539ab5be1549367d4417eaf8480'}]}, 'timestamp': '2026-01-22 22:34:07.368015', '_unique_id': 'ab900499c0dc49a6aa06af8c5b363ef5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.377 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2cf209bf-cc1d-4f9f-953d-c73d3a446160 / tapb4123d25-3d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.378 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.outgoing.bytes volume: 1480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.381 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a7098cc6-e436-4068-b543-d5fc63321c86 / tap97b31fc4-03 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.381 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '708827b6-7640-4d4c-9e0b-a6398dd7b0f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1480, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.375118', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76a69bc0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': 'ac1233249409d5515ab17552f32b10bacbea98582626e1aa8adfd66fa9cba4bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 
'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.375118', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76a71d16-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '8d02d900ad064c0e80bb22d8df52be4f3c9da5c8dd3134e7a06f7521b10e0417'}]}, 'timestamp': '2026-01-22 22:34:07.382137', '_unique_id': '29f9b11e67e14f84b314862e9543bcff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 kernel: tap97b31fc4-03 (unregistering): left promiscuous mode
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.385 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.386 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.386 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f302567c-d8d6-4fce-911c-998697726212', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.386066', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76a7cce8-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': 'efbc3cc91bdc6bdd62e961ca1d95f4edaa4f299d72f23fa9952a813f5d3eb9d2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.386066', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76a7da30-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '0591ab84676e91cf986775e9c725a2bf77297bff53f490beaef06c4d97b783f5'}]}, 'timestamp': '2026-01-22 22:34:07.386835', '_unique_id': 'cbffb6c4554d4ff8b159cc1bf70dce9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.389 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:34:07 np0005592767 NetworkManager[54973]: <info>  [1769121247.3926] device (tap97b31fc4-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:34:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:07Z|00389|binding|INFO|Releasing lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 from this chassis (sb_readonly=0)
Jan 22 17:34:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:07Z|00390|binding|INFO|Setting lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 down in Southbound
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.460 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:07Z|00391|binding|INFO|Removing iface tap97b31fc4-03 ovn-installed in OVS
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.463 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.471 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:a8:6c 10.100.0.7'], port_security=['fa:16:3e:b5:a8:6c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e48ef497-1516-42f4-a190-f681841535fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793af464aeec424ead871e11355f94e3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8c5026e2-04ae-4fa8-b97c-9c07f7527d9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a172c89f-e8b1-4850-8807-9110ecdce271, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=97b31fc4-0356-4ece-b72f-7e18f5cc4782) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.474 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 97b31fc4-0356-4ece-b72f-7e18f5cc4782 in datapath e48ef497-1516-42f4-a190-f681841535fb unbound from our chassis#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.478 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e48ef497-1516-42f4-a190-f681841535fb#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.479 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.482 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.483 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.502 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0e67a002-0c96-4d6c-b3ce-779213f6c7e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:07 np0005592767 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000067.scope: Deactivated successfully.
Jan 22 17:34:07 np0005592767 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000067.scope: Consumed 12.506s CPU time.
Jan 22 17:34:07 np0005592767 systemd-machined[153912]: Machine qemu-50-instance-00000067 terminated.
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.536 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[05d458f5-4622-43ab-a954-c6378ea5b3e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.539 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1546e7-fbb5-4618-b650-c14e12f609ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.576 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[dee8b947-9ec7-47a9-8bc6-fcbe73e4f391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.593 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e5fe5487-2e11-445d-8b32-340d58dc5303]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape48ef497-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:e0:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482317, 'reachable_time': 40042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225749, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.615 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[59474c2a-d327-4b0f-9274-0676d97d1dfc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482331, 'tstamp': 482331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225750, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482334, 'tstamp': 482334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225750, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.616 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape48ef497-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.618 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.623 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.624 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape48ef497-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.624 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.625 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape48ef497-10, col_values=(('external_ids', {'iface-id': 'a8d1d971-761f-423e-815b-1f3891a52bf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:07.626 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.741 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '138f8b65-0e92-4321-a981-7056cd3212e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.389590', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76b69052-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '87ae3942cd1e3c14c76b42045969b8d510d31367013d39d138f3721f49a011ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 
'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.389590', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76b6a25e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '2da08f066db740381dad3eae30c149cc8a698f97bdb1fd36582845b17df53d39'}]}, 'timestamp': '2026-01-22 22:34:07.742377', '_unique_id': '2c8fad919c884dee8fd1898b8dd040b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.745 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.747 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.747 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1858917-a995-4c29-ae04-a1771c0ca717', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.746972', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76dee37c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': '9a243caed945666e635a71751840d96436da007c19622a1ea46b65cd7b9c4ec6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.746972', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76def858-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '9f3b0c74647e0ae705c7417560990121311f16fe007d1f4dd24e8e44737df4ce'}]}, 'timestamp': '2026-01-22 22:34:07.748120', '_unique_id': 'b525d825d2f848678bf1d504b409a912'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.748 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.749 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.750 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.750 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd77a48cc-23c8-459d-a2e0-4240935886cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.749998', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76df529e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': '08fa951b14a2ddeda9bd938b87ca7b49b723adce26a8ff649fe422838471b3c8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
10, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.749998', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76df60fe-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': 'aef9b3a2252023b3c4292207226efcfd70511ef9588df9307e3564ab8cd34149'}]}, 'timestamp': '2026-01-22 22:34:07.750755', '_unique_id': '9d0b386ea9b347edab151c059498e907'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.751 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.752 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.752 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0b101f9-03d0-42c7-8cd5-fb01fa30bd3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.752476', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76dfb1da-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': '13dab7cf13dbda065eade164c8771fb383457c56061af09e61d5029458f2d6c4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.752476', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76dfbe32-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': 'b7424275316dc76f29da3887f815a5809ea44735ee0dfd933889655e5cb6bc13'}]}, 'timestamp': '2026-01-22 22:34:07.753162', '_unique_id': 'bdd52080979547209a0680439a22c8ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.753 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.754 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.775 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/memory.usage volume: 42.7734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.776 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6db519ea-1065-401a-b02a-6375d8633ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7734375, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'timestamp': '2026-01-22T22:34:07.755095', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '76e33ad0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.410356025, 'message_signature': '20661906b05674dab7af692c2a53d296edd52b48f132fb4ad5323ba2f9bc23b9'}]}, 'timestamp': '2026-01-22 22:34:07.776939', '_unique_id': '03541215a7d1413abd0fd498cbc446e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.777 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.779 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.read.requests volume: 1102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.779 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.780 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2d4251d-8c05-4fc7-b50d-3300adb4c964', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1102, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.779420', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76e3d148-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': 'f1dc3380dafc38fdbbfe5a57e14ae1c0dd9d3b840a7a2427d0862e092c283f0f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 
'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.779420', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76e3e084-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '055a768740a398518338ce87a61c3e2aa8dec9fd0b7b2b0a2048f259944eda15'}]}, 'timestamp': '2026-01-22 22:34:07.781100', '_unique_id': '4f7e58574beb4bd681e0379114ece940'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.781 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.783 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.783 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1f878f9-d64c-47f9-b2b2-7b9d5a0095d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.783449', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76e46ff4-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': '767c0b61293d4c82135ccf7ad99d223b478972c8c71df68efe9ac4d029d36f6d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 
'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.783449', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76e48444-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '11bfc69594f2e97f5b4a942850c8f325d54e74dd2535d5279e4c24e1f37a428b'}]}, 'timestamp': '2026-01-22 22:34:07.784480', '_unique_id': '30963166a9ff4aa799e7e25db6d8ee56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.785 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.786 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.787 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.787 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>]
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.787 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.787 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.read.bytes volume: 30513664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.788 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.789 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f39e6c18-0968-43d0-a0fd-97bd30cd3ca0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30513664, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.787879', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76e51d3c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '0a3b817d31e5c4a74537139eacc065333416a4b468a6473c0909a85773fa359f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': 
None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.787879', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76e5307e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '01afe1adc73f1236cb37b1417c851f958434d24459094328e5ff4aa47e50fa00'}]}, 'timestamp': '2026-01-22 22:34:07.789912', '_unique_id': 'fe28a93b2a31407faa96e8fd392c338d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.790 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.792 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.792 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/cpu volume: 11300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.793 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '385a2191-0e09-4c0c-afb4-a1761526982d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11300000000, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'timestamp': '2026-01-22T22:34:07.792475', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '76e5d010-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.410356025, 'message_signature': '09ff2ab7a3d419e06b7f7867b14f6d4cd0878a250ae147ffcb60600e34af9495'}]}, 'timestamp': '2026-01-22 22:34:07.793846', '_unique_id': 'ee9e9e1952134f48ac556bfbed5ed366'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.794 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.796 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.796 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f175a02a-4081-4438-9064-9bf9ced0551a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.796297', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76e665c0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': 'd63bd3de4af9ddd337d76f6a9ce481e106f24eeb8ecab74ccb16a70554f49e6e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.796297', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76e675ce-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '0d6f1dd4c29e2ee27a44837ded0f08e261036129fc19f850abd8c33420db0b2c'}]}, 'timestamp': '2026-01-22 22:34:07.797143', '_unique_id': 'f1638a94d1704310b2b80d450e1289c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.797 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.799 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.799 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>]
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.800 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.800 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.800 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.801 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd848e61-ddd3-46f6-91ac-5cbc9fa296ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.799976', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76e6f404-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.968136482, 'message_signature': '05895119f0bee95f7becff7bc0a45989533571700ca78b0b5791c11b9aad0b63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 
'2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.799976', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76e70214-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.968136482, 'message_signature': '9f087d1859bf55ed88c3da4e901388d722860f39948aec2c84218489b92b1953'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'a7098cc6-e436-4068-b543-d5fc63321c86-vda', 'timestamp': '2026-01-22T22:34:07.799976', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'instance-00000067', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76e70fb6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.986463101, 'message_signature': '0969a54d5cb1c4d69b656427ad1fe46d9dff75bfd3e8a0a88e136aca4c4e721b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'a7098cc6-e436-4068-b543-d5fc63321c86-sda', 'timestamp': '2026-01-22T22:34:07.799976', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'instance-00000067', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76e71e02-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.986463101, 'message_signature': '3deebb7ef17a1583957e26285c29485ac9e48e5aa059cb40cfad8ce73c7d54e7'}]}, 'timestamp': '2026-01-22 22:34:07.801441', '_unique_id': 'f34bf87f55ce40d9a11f410bc14ec824'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.802 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.803 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.read.latency volume: 200007051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.804 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.read.latency volume: 28623801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.805 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '230095e2-921a-45b5-8de4-808ae78a74cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200007051, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.803678', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76e78400-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '24f4a12e97cd4673c009c03d0c09d5daba2dfc782eb1d40f4de20d92c8c0429c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28623801, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 
'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.803678', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76e7926a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '53c213197b864dfd9b8b2ed9929965fc998ec32d358ca3fcfac28817e7dde66d'}]}, 'timestamp': '2026-01-22 22:34:07.805708', '_unique_id': 'a0e54c1049974e06936bfcab114f97c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.806 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.807 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.write.latency volume: 1850778515 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.808 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.809 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09f4c70a-6e15-4598-ae2f-79c3aa136231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1850778515, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.807947', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76e82d38-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '9b8945d8c87073622ea7b37a15a15d8eff7dab2049d8dec21f231b87712f9ccc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 
'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.807947', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76e83b02-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '801ccd370526f326ca5b766efa0606c9fd1dc69df11237fb26f7625d758332e6'}]}, 'timestamp': '2026-01-22 22:34:07.809423', '_unique_id': '2246eb43e34b4e1486ec2222e316ea47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.810 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.812 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.812 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.813 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2580b5a7-edc6-453b-bd1a-69616ae757e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.812446', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76e8e26e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': 'c6f3b830b9f6551af7815da86b66e24acfd70590384d6c70b3f129793400fd88'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.812446', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76e8fbe6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '184d0462732155ef8530964eaf27332a7d7e3763e6fa5f4c8e3c4b06d0a79861'}]}, 'timestamp': '2026-01-22 22:34:07.813776', '_unique_id': '5176942368a24cb98daeb25ec996c753'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.815 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.816 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.817 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.817 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cd38096-7cd4-4952-a926-636d939b767e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.817104', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76e995ce-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': 'fc51b4d798d4ff25c3d0ebea4caebeb25e1352fc6c0fa43742118cacbb389464'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
12, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.817104', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76e9a92e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '41e86686e82385de19b848a591570a3bd7ecf6186bf22a214b313ad019c39941'}]}, 'timestamp': '2026-01-22 22:34:07.818296', '_unique_id': '044bff228ec64067834257f5f155f881'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.819 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.821 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.822 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.write.bytes volume: 72847360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.823 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.824 12 DEBUG ceilometer.compute.pollsters [-] Instance a7098cc6-e436-4068-b543-d5fc63321c86 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000067, id=a7098cc6-e436-4068-b543-d5fc63321c86>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f37b89a9-b4e9-44f1-8406-46e76323aad3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72847360, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.822292', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76ea647c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '8f6bf3f82fbdb26ad62322ae7934f0266aa0a9b1b6865b4197f49bc5f194f6c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 
'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.822292', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76ea866e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.024484997, 'message_signature': '2dc49ab6033f6fe80d44ae613b68cb0dc206d06505c02b5a7da32b5af38cb91f'}]}, 'timestamp': '2026-01-22 22:34:07.825232', '_unique_id': '15eefaa3dc2548dbb3abb09692b06a58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.826 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.829 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.829 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.830 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.830 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eedcbe56-2c5e-4c6a-b0b3-1b3382c8fc3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-vda', 'timestamp': '2026-01-22T22:34:07.829534', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76eb74c0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.968136482, 'message_signature': '9d961a0307a7ee9f7420978d21b496f12853f08b5aef9cc99afad857bed9d85c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 
'resource_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160-sda', 'timestamp': '2026-01-22T22:34:07.829534', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'instance-00000066', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76eb7e7a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.968136482, 'message_signature': '99eeb660e3528757744da38e526d1100b65462051d9193335eeb9edb10c6c458'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'a7098cc6-e436-4068-b543-d5fc63321c86-vda', 'timestamp': '2026-01-22T22:34:07.829534', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'instance-00000067', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76eb8852-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.986463101, 'message_signature': '257dcbbe763ec6a42ae34cf33e73c060678b3a42a4029eaf5e5261717c0225e2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'a7098cc6-e436-4068-b543-d5fc63321c86-sda', 'timestamp': '2026-01-22T22:34:07.829534', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'instance-00000067', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76eb90ea-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4843.986463101, 'message_signature': '0833b9b892c6c64f067cc24dc32da13f25d55518ff31a2ecdebf3bb1b1dfffb4'}]}, 'timestamp': '2026-01-22 22:34:07.830560', '_unique_id': '5f3f8afcdcde4d25a814fb3a826eaecc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.831 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.832 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.832 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2065365074>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1215017684>]
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.832 12 DEBUG ceilometer.compute.pollsters [-] 2cf209bf-cc1d-4f9f-953d-c73d3a446160/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.832 12 DEBUG ceilometer.compute.pollsters [-] a7098cc6-e436-4068-b543-d5fc63321c86/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c2bc509-a619-4d5e-8fa5-d86256f6dc96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000066-2cf209bf-cc1d-4f9f-953d-c73d3a446160-tapb4123d25-3d', 'timestamp': '2026-01-22T22:34:07.832479', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2065365074', 'name': 'tapb4123d25-3d', 'instance_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:8b:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4123d25-3d'}, 'message_id': '76ebe522-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.010070999, 'message_signature': '1a217250f0abfc5bf3c823467c8dcc558ff696b593a6dbc5f14122d988ff4a08'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee8ea21df7c94181896a0576e091d6bf', 'user_name': None, 'project_id': '793af464aeec424ead871e11355f94e3', 'project_name': None, 'resource_id': 'instance-00000067-a7098cc6-e436-4068-b543-d5fc63321c86-tap97b31fc4-03', 'timestamp': '2026-01-22T22:34:07.832479', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1215017684', 'name': 'tap97b31fc4-03', 'instance_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'instance_type': 'm1.nano', 'host': 'e37ea0167d470aa7e9f9a6cf5e1da87632d63b95b70a8c4d2347a4b8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b5:a8:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97b31fc4-03'}, 'message_id': '76ebed88-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4844.013690581, 'message_signature': '1aa6ada3cccffb1fbab61b17c6f9c2f8a8f669856dfd2b49f9cb2575f8740fec'}]}, 'timestamp': '2026-01-22 22:34:07.832921', '_unique_id': 'f54f88d08fc444a3aea2fac4d40278d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:34:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:34:07.833 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.849 182627 DEBUG nova.compute.manager [req-fd57cb7b-b89e-47e7-9462-5cd71043677d req-2eaa23a2-0e09-41f4-8166-91d16759a643 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-unplugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.850 182627 DEBUG oslo_concurrency.lockutils [req-fd57cb7b-b89e-47e7-9462-5cd71043677d req-2eaa23a2-0e09-41f4-8166-91d16759a643 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.850 182627 DEBUG oslo_concurrency.lockutils [req-fd57cb7b-b89e-47e7-9462-5cd71043677d req-2eaa23a2-0e09-41f4-8166-91d16759a643 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.850 182627 DEBUG oslo_concurrency.lockutils [req-fd57cb7b-b89e-47e7-9462-5cd71043677d req-2eaa23a2-0e09-41f4-8166-91d16759a643 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.850 182627 DEBUG nova.compute.manager [req-fd57cb7b-b89e-47e7-9462-5cd71043677d req-2eaa23a2-0e09-41f4-8166-91d16759a643 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-unplugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:07 np0005592767 nova_compute[182623]: 2026-01-22 22:34:07.851 182627 WARNING nova.compute.manager [req-fd57cb7b-b89e-47e7-9462-5cd71043677d req-2eaa23a2-0e09-41f4-8166-91d16759a643 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received unexpected event network-vif-unplugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.183 182627 INFO nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.189 182627 INFO nova.virt.libvirt.driver [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance destroyed successfully.#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.189 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'numa_topology' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.205 182627 INFO nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Attempting rescue#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.206 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.209 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.209 182627 INFO nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Creating image(s)#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.209 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.210 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.210 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.210 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.232 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.232 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.243 182627 DEBUG oslo_concurrency.processutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.298 182627 DEBUG oslo_concurrency.processutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.299 182627 DEBUG oslo_concurrency.processutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.333 182627 DEBUG oslo_concurrency.processutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.rescue" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.334 182627 DEBUG oslo_concurrency.lockutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.334 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'migration_context' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.347 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.348 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Start _get_guest_xml network_info=[{"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "vif_mac": "fa:16:3e:b5:a8:6c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.348 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'resources' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.381 182627 WARNING nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.387 182627 DEBUG nova.virt.libvirt.host [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.388 182627 DEBUG nova.virt.libvirt.host [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.391 182627 DEBUG nova.virt.libvirt.host [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.391 182627 DEBUG nova.virt.libvirt.host [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.393 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.393 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.393 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.393 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.394 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.394 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.394 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.394 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.394 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.395 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.395 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.395 182627 DEBUG nova.virt.hardware [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.395 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.409 182627 DEBUG nova.virt.libvirt.vif [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1215017684',display_name='tempest-ServerRescueNegativeTestJSON-server-1215017684',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1215017684',id=103,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:33:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-ygqi6u2u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_user_name='tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:33:50Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=a7098cc6-e436-4068-b543-d5fc63321c86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "vif_mac": "fa:16:3e:b5:a8:6c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.409 182627 DEBUG nova.network.os_vif_util [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "vif_mac": "fa:16:3e:b5:a8:6c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.410 182627 DEBUG nova.network.os_vif_util [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.411 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.423 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <uuid>a7098cc6-e436-4068-b543-d5fc63321c86</uuid>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <name>instance-00000067</name>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1215017684</nova:name>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:34:08</nova:creationTime>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:user uuid="ee8ea21df7c94181896a0576e091d6bf">tempest-ServerRescueNegativeTestJSON-554166911-project-member</nova:user>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:project uuid="793af464aeec424ead871e11355f94e3">tempest-ServerRescueNegativeTestJSON-554166911</nova:project>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        <nova:port uuid="97b31fc4-0356-4ece-b72f-7e18f5cc4782">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <entry name="serial">a7098cc6-e436-4068-b543-d5fc63321c86</entry>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <entry name="uuid">a7098cc6-e436-4068-b543-d5fc63321c86</entry>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.rescue"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <target dev="vdb" bus="virtio"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config.rescue"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b5:a8:6c"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <target dev="tap97b31fc4-03"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/console.log" append="off"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:34:08 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:34:08 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:34:08 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:34:08 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.430 182627 INFO nova.virt.libvirt.driver [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance destroyed successfully.#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.478 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.479 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.479 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.479 182627 DEBUG nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] No VIF found with MAC fa:16:3e:b5:a8:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.480 182627 INFO nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Using config drive#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.493 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:08 np0005592767 nova_compute[182623]: 2026-01-22 22:34:08.519 182627 DEBUG nova.objects.instance [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'keypairs' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.066 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.148 182627 INFO nova.virt.libvirt.driver [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Creating config drive at /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config.rescue#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.156 182627 DEBUG oslo_concurrency.processutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwol4gglu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.288 182627 DEBUG oslo_concurrency.processutils [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwol4gglu" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:09 np0005592767 kernel: tap97b31fc4-03: entered promiscuous mode
Jan 22 17:34:09 np0005592767 systemd-udevd[225740]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:34:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:09Z|00392|binding|INFO|Claiming lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 for this chassis.
Jan 22 17:34:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:09Z|00393|binding|INFO|97b31fc4-0356-4ece-b72f-7e18f5cc4782: Claiming fa:16:3e:b5:a8:6c 10.100.0.7
Jan 22 17:34:09 np0005592767 NetworkManager[54973]: <info>  [1769121249.4226] manager: (tap97b31fc4-03): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.429 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:a8:6c 10.100.0.7'], port_security=['fa:16:3e:b5:a8:6c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e48ef497-1516-42f4-a190-f681841535fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793af464aeec424ead871e11355f94e3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8c5026e2-04ae-4fa8-b97c-9c07f7527d9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a172c89f-e8b1-4850-8807-9110ecdce271, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=97b31fc4-0356-4ece-b72f-7e18f5cc4782) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.430 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 97b31fc4-0356-4ece-b72f-7e18f5cc4782 in datapath e48ef497-1516-42f4-a190-f681841535fb bound to our chassis#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.432 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e48ef497-1516-42f4-a190-f681841535fb#033[00m
Jan 22 17:34:09 np0005592767 NetworkManager[54973]: <info>  [1769121249.4342] device (tap97b31fc4-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:34:09 np0005592767 NetworkManager[54973]: <info>  [1769121249.4354] device (tap97b31fc4-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:34:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:09Z|00394|binding|INFO|Setting lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 ovn-installed in OVS
Jan 22 17:34:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:09Z|00395|binding|INFO|Setting lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 up in Southbound
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.451 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d37dd2-64c7-4f56-b74b-fd6234d8ce23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:09 np0005592767 systemd-machined[153912]: New machine qemu-51-instance-00000067.
Jan 22 17:34:09 np0005592767 systemd[1]: Started Virtual Machine qemu-51-instance-00000067.
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.482 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ab711843-f004-432d-b378-a25db53123fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.484 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[55d10cde-54b8-4d9f-8c6c-6153c31c35f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.510 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e47c03f9-e3ef-49a7-b253-7405cfaa4436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.534 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9ec9b490-4c58-4b9e-a740-df8177a3513c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape48ef497-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:e0:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482317, 'reachable_time': 40042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225807, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.556 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8bb111-b9ef-4fe4-9ab9-d782a6e0edd3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482331, 'tstamp': 482331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225809, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482334, 'tstamp': 482334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225809, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.558 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape48ef497-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.559 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.562 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape48ef497-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.563 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.563 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape48ef497-10, col_values=(('external_ids', {'iface-id': 'a8d1d971-761f-423e-815b-1f3891a52bf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:09.563 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.809 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for a7098cc6-e436-4068-b543-d5fc63321c86 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.810 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121249.8087554, a7098cc6-e436-4068-b543-d5fc63321c86 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.810 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.827 182627 DEBUG nova.compute.manager [None req-f6d83b3b-ae8b-4eae-aaa5-f6053d410ca9 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.848 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.854 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.890 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.891 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121249.810573, a7098cc6-e436-4068-b543-d5fc63321c86 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.891 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] VM Started (Lifecycle Event)#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.953 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:09 np0005592767 nova_compute[182623]: 2026-01-22 22:34:09.957 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.007 182627 DEBUG nova.compute.manager [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.008 182627 DEBUG oslo_concurrency.lockutils [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.009 182627 DEBUG oslo_concurrency.lockutils [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.009 182627 DEBUG oslo_concurrency.lockutils [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.009 182627 DEBUG nova.compute.manager [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.010 182627 WARNING nova.compute.manager [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received unexpected event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.010 182627 DEBUG nova.compute.manager [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.011 182627 DEBUG oslo_concurrency.lockutils [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.011 182627 DEBUG oslo_concurrency.lockutils [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.011 182627 DEBUG oslo_concurrency.lockutils [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.012 182627 DEBUG nova.compute.manager [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:10 np0005592767 nova_compute[182623]: 2026-01-22 22:34:10.012 182627 WARNING nova.compute.manager [req-a2756190-f35f-470d-aa82-09b63ac732eb req-b5a5ddb8-0fb0-4a2d-87e9-8244c3a8ea78 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received unexpected event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:34:11 np0005592767 nova_compute[182623]: 2026-01-22 22:34:11.343 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:12.105 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:12.106 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:12.107 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:12 np0005592767 nova_compute[182623]: 2026-01-22 22:34:12.152 182627 DEBUG nova.compute.manager [req-c240b3dd-fa4e-412b-9ae8-9e11807e06ed req-a393d470-4f33-411d-8bb3-bd05a4273ee5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:12 np0005592767 nova_compute[182623]: 2026-01-22 22:34:12.153 182627 DEBUG oslo_concurrency.lockutils [req-c240b3dd-fa4e-412b-9ae8-9e11807e06ed req-a393d470-4f33-411d-8bb3-bd05a4273ee5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:12 np0005592767 nova_compute[182623]: 2026-01-22 22:34:12.153 182627 DEBUG oslo_concurrency.lockutils [req-c240b3dd-fa4e-412b-9ae8-9e11807e06ed req-a393d470-4f33-411d-8bb3-bd05a4273ee5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:12 np0005592767 nova_compute[182623]: 2026-01-22 22:34:12.154 182627 DEBUG oslo_concurrency.lockutils [req-c240b3dd-fa4e-412b-9ae8-9e11807e06ed req-a393d470-4f33-411d-8bb3-bd05a4273ee5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:12 np0005592767 nova_compute[182623]: 2026-01-22 22:34:12.154 182627 DEBUG nova.compute.manager [req-c240b3dd-fa4e-412b-9ae8-9e11807e06ed req-a393d470-4f33-411d-8bb3-bd05a4273ee5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:12 np0005592767 nova_compute[182623]: 2026-01-22 22:34:12.155 182627 WARNING nova.compute.manager [req-c240b3dd-fa4e-412b-9ae8-9e11807e06ed req-a393d470-4f33-411d-8bb3-bd05a4273ee5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received unexpected event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:34:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:13.004 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.257 182627 INFO nova.compute.manager [None req-f68965d7-b462-487c-a9c4-a97944a35e5f ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Pausing#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.259 182627 DEBUG nova.objects.instance [None req-f68965d7-b462-487c-a9c4-a97944a35e5f ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'flavor' on Instance uuid 2cf209bf-cc1d-4f9f-953d-c73d3a446160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.322 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121253.3219757, 2cf209bf-cc1d-4f9f-953d-c73d3a446160 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.323 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.325 182627 DEBUG nova.compute.manager [None req-f68965d7-b462-487c-a9c4-a97944a35e5f ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.344 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.348 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:13 np0005592767 nova_compute[182623]: 2026-01-22 22:34:13.376 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 22 17:34:14 np0005592767 nova_compute[182623]: 2026-01-22 22:34:14.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:15 np0005592767 podman[225817]: 2026-01-22 22:34:15.164584951 +0000 UTC m=+0.080302753 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.522 182627 INFO nova.compute.manager [None req-be3f8796-f1cc-464b-8e56-07a3b751cdab ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Unpausing#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.523 182627 DEBUG nova.objects.instance [None req-be3f8796-f1cc-464b-8e56-07a3b751cdab ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'flavor' on Instance uuid 2cf209bf-cc1d-4f9f-953d-c73d3a446160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.559 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121255.5594084, 2cf209bf-cc1d-4f9f-953d-c73d3a446160 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.560 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:34:15 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.568 182627 DEBUG nova.virt.libvirt.guest [None req-be3f8796-f1cc-464b-8e56-07a3b751cdab ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.568 182627 DEBUG nova.compute.manager [None req-be3f8796-f1cc-464b-8e56-07a3b751cdab ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.583 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.587 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:15 np0005592767 nova_compute[182623]: 2026-01-22 22:34:15.617 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 22 17:34:16 np0005592767 nova_compute[182623]: 2026-01-22 22:34:16.344 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:17 np0005592767 nova_compute[182623]: 2026-01-22 22:34:17.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.232 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.233 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.233 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.234 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.234 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.252 182627 INFO nova.compute.manager [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Terminating instance#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.266 182627 DEBUG nova.compute.manager [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:34:18 np0005592767 kernel: tap97b31fc4-03 (unregistering): left promiscuous mode
Jan 22 17:34:18 np0005592767 NetworkManager[54973]: <info>  [1769121258.2944] device (tap97b31fc4-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:34:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:18Z|00396|binding|INFO|Releasing lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 from this chassis (sb_readonly=0)
Jan 22 17:34:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:18Z|00397|binding|INFO|Setting lport 97b31fc4-0356-4ece-b72f-7e18f5cc4782 down in Southbound
Jan 22 17:34:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:18Z|00398|binding|INFO|Removing iface tap97b31fc4-03 ovn-installed in OVS
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.301 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.315 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.320 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:a8:6c 10.100.0.7'], port_security=['fa:16:3e:b5:a8:6c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7098cc6-e436-4068-b543-d5fc63321c86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e48ef497-1516-42f4-a190-f681841535fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793af464aeec424ead871e11355f94e3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '8c5026e2-04ae-4fa8-b97c-9c07f7527d9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a172c89f-e8b1-4850-8807-9110ecdce271, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=97b31fc4-0356-4ece-b72f-7e18f5cc4782) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.321 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 97b31fc4-0356-4ece-b72f-7e18f5cc4782 in datapath e48ef497-1516-42f4-a190-f681841535fb unbound from our chassis#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.324 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e48ef497-1516-42f4-a190-f681841535fb#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.341 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[399e04c8-8e3b-43a3-a2a9-eb3d0fe95d37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:18 np0005592767 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000067.scope: Deactivated successfully.
Jan 22 17:34:18 np0005592767 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000067.scope: Consumed 9.015s CPU time.
Jan 22 17:34:18 np0005592767 systemd-machined[153912]: Machine qemu-51-instance-00000067 terminated.
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.370 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e5cf84dd-3578-4102-98fc-b5669d523775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.373 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8727a42d-24ce-455e-a847-ef95e80a910e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.398 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ae72f8-8e58-434e-9efe-aaedce20b81f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.416 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[01a4d00d-e23f-44f0-ab41-f61abcc046e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape48ef497-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:e0:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482317, 'reachable_time': 40042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225849, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.432 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ea46c3-99c5-452a-98d8-901282209fcc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482331, 'tstamp': 482331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225850, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape48ef497-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482334, 'tstamp': 482334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225850, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.433 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape48ef497-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.434 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.439 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape48ef497-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.439 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.439 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape48ef497-10, col_values=(('external_ids', {'iface-id': 'a8d1d971-761f-423e-815b-1f3891a52bf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:18.440 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.541 182627 DEBUG nova.compute.manager [req-39e57219-fc09-493d-a9ba-23cea4cef02d req-7dea277b-1bc5-4f3e-90dd-74ecbc7508d0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-unplugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.541 182627 DEBUG oslo_concurrency.lockutils [req-39e57219-fc09-493d-a9ba-23cea4cef02d req-7dea277b-1bc5-4f3e-90dd-74ecbc7508d0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.542 182627 DEBUG oslo_concurrency.lockutils [req-39e57219-fc09-493d-a9ba-23cea4cef02d req-7dea277b-1bc5-4f3e-90dd-74ecbc7508d0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.542 182627 DEBUG oslo_concurrency.lockutils [req-39e57219-fc09-493d-a9ba-23cea4cef02d req-7dea277b-1bc5-4f3e-90dd-74ecbc7508d0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.542 182627 DEBUG nova.compute.manager [req-39e57219-fc09-493d-a9ba-23cea4cef02d req-7dea277b-1bc5-4f3e-90dd-74ecbc7508d0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-unplugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.543 182627 DEBUG nova.compute.manager [req-39e57219-fc09-493d-a9ba-23cea4cef02d req-7dea277b-1bc5-4f3e-90dd-74ecbc7508d0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-unplugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.546 182627 INFO nova.virt.libvirt.driver [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Instance destroyed successfully.#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.547 182627 DEBUG nova.objects.instance [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'resources' on Instance uuid a7098cc6-e436-4068-b543-d5fc63321c86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.557 182627 DEBUG nova.virt.libvirt.vif [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1215017684',display_name='tempest-ServerRescueNegativeTestJSON-server-1215017684',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1215017684',id=103,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-ygqi6u2u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_user_name='tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:34:09Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=a7098cc6-e436-4068-b543-d5fc63321c86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.558 182627 DEBUG nova.network.os_vif_util [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "address": "fa:16:3e:b5:a8:6c", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b31fc4-03", "ovs_interfaceid": "97b31fc4-0356-4ece-b72f-7e18f5cc4782", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.559 182627 DEBUG nova.network.os_vif_util [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.559 182627 DEBUG os_vif [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.562 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97b31fc4-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.564 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.566 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.568 182627 INFO os_vif [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:a8:6c,bridge_name='br-int',has_traffic_filtering=True,id=97b31fc4-0356-4ece-b72f-7e18f5cc4782,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b31fc4-03')#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.569 182627 INFO nova.virt.libvirt.driver [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Deleting instance files /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86_del#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.570 182627 INFO nova.virt.libvirt.driver [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Deletion of /var/lib/nova/instances/a7098cc6-e436-4068-b543-d5fc63321c86_del complete#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.654 182627 INFO nova.compute.manager [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.655 182627 DEBUG oslo.service.loopingcall [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.655 182627 DEBUG nova.compute.manager [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:34:18 np0005592767 nova_compute[182623]: 2026-01-22 22:34:18.655 182627 DEBUG nova.network.neutron [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.461 182627 DEBUG nova.network.neutron [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.479 182627 INFO nova.compute.manager [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Took 0.82 seconds to deallocate network for instance.#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.543 182627 DEBUG nova.compute.manager [req-789c9f14-976f-4e91-af51-ddd6e4148ca9 req-ccb34476-b142-4394-bb5c-f93227463a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-deleted-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.568 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.569 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.826 182627 DEBUG nova.compute.provider_tree [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.870 182627 DEBUG nova.scheduler.client.report [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.905 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:19 np0005592767 nova_compute[182623]: 2026-01-22 22:34:19.966 182627 INFO nova.scheduler.client.report [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Deleted allocations for instance a7098cc6-e436-4068-b543-d5fc63321c86#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.112 182627 DEBUG oslo_concurrency.lockutils [None req-ab945335-2723-41b7-9d18-c66b9f774442 ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.614 182627 DEBUG nova.compute.manager [req-d22449ed-3460-43e7-a8d5-6b972ba9a637 req-6ad609fe-f0b0-481a-bddb-a47522280b89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.615 182627 DEBUG oslo_concurrency.lockutils [req-d22449ed-3460-43e7-a8d5-6b972ba9a637 req-6ad609fe-f0b0-481a-bddb-a47522280b89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.615 182627 DEBUG oslo_concurrency.lockutils [req-d22449ed-3460-43e7-a8d5-6b972ba9a637 req-6ad609fe-f0b0-481a-bddb-a47522280b89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.616 182627 DEBUG oslo_concurrency.lockutils [req-d22449ed-3460-43e7-a8d5-6b972ba9a637 req-6ad609fe-f0b0-481a-bddb-a47522280b89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a7098cc6-e436-4068-b543-d5fc63321c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.616 182627 DEBUG nova.compute.manager [req-d22449ed-3460-43e7-a8d5-6b972ba9a637 req-6ad609fe-f0b0-481a-bddb-a47522280b89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] No waiting events found dispatching network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.616 182627 WARNING nova.compute.manager [req-d22449ed-3460-43e7-a8d5-6b972ba9a637 req-6ad609fe-f0b0-481a-bddb-a47522280b89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Received unexpected event network-vif-plugged-97b31fc4-0356-4ece-b72f-7e18f5cc4782 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:34:20 np0005592767 nova_compute[182623]: 2026-01-22 22:34:20.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.289 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.290 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.290 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.291 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.291 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.308 182627 INFO nova.compute.manager [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Terminating instance#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.321 182627 DEBUG nova.compute.manager [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:34:21 np0005592767 kernel: tapb4123d25-3d (unregistering): left promiscuous mode
Jan 22 17:34:21 np0005592767 NetworkManager[54973]: <info>  [1769121261.3477] device (tapb4123d25-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.359 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.367 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:21Z|00399|binding|INFO|Releasing lport b4123d25-3dda-4aac-84ab-b88e969ec9a7 from this chassis (sb_readonly=0)
Jan 22 17:34:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:21Z|00400|binding|INFO|Setting lport b4123d25-3dda-4aac-84ab-b88e969ec9a7 down in Southbound
Jan 22 17:34:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:34:21Z|00401|binding|INFO|Removing iface tapb4123d25-3d ovn-installed in OVS
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.369 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.373 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:8b:7a 10.100.0.11'], port_security=['fa:16:3e:45:8b:7a 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2cf209bf-cc1d-4f9f-953d-c73d3a446160', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e48ef497-1516-42f4-a190-f681841535fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793af464aeec424ead871e11355f94e3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8c5026e2-04ae-4fa8-b97c-9c07f7527d9c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a172c89f-e8b1-4850-8807-9110ecdce271, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b4123d25-3dda-4aac-84ab-b88e969ec9a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.374 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b4123d25-3dda-4aac-84ab-b88e969ec9a7 in datapath e48ef497-1516-42f4-a190-f681841535fb unbound from our chassis#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.376 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e48ef497-1516-42f4-a190-f681841535fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.376 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae96d8f-6508-465a-9508-f34333a2ff25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.377 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e48ef497-1516-42f4-a190-f681841535fb namespace which is not needed anymore#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.383 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.388 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.389 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.389 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.389 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2cf209bf-cc1d-4f9f-953d-c73d3a446160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:21 np0005592767 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000066.scope: Deactivated successfully.
Jan 22 17:34:21 np0005592767 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000066.scope: Consumed 13.190s CPU time.
Jan 22 17:34:21 np0005592767 systemd-machined[153912]: Machine qemu-49-instance-00000066 terminated.
Jan 22 17:34:21 np0005592767 podman[225877]: 2026-01-22 22:34:21.504831562 +0000 UTC m=+0.103470589 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 22 17:34:21 np0005592767 podman[225875]: 2026-01-22 22:34:21.531078114 +0000 UTC m=+0.129262728 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.540 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.545 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [NOTICE]   (225552) : haproxy version is 2.8.14-c23fe91
Jan 22 17:34:21 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [NOTICE]   (225552) : path to executable is /usr/sbin/haproxy
Jan 22 17:34:21 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [WARNING]  (225552) : Exiting Master process...
Jan 22 17:34:21 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [ALERT]    (225552) : Current worker (225554) exited with code 143 (Terminated)
Jan 22 17:34:21 np0005592767 neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb[225548]: [WARNING]  (225552) : All workers exited. Exiting... (0)
Jan 22 17:34:21 np0005592767 systemd[1]: libpod-bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a.scope: Deactivated successfully.
Jan 22 17:34:21 np0005592767 podman[225943]: 2026-01-22 22:34:21.557537533 +0000 UTC m=+0.049916684 container died bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.583 182627 INFO nova.virt.libvirt.driver [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Instance destroyed successfully.#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.583 182627 DEBUG nova.objects.instance [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lazy-loading 'resources' on Instance uuid 2cf209bf-cc1d-4f9f-953d-c73d3a446160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:21 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a-userdata-shm.mount: Deactivated successfully.
Jan 22 17:34:21 np0005592767 systemd[1]: var-lib-containers-storage-overlay-6a2a5937b92ddd2ea8692275e76ac1ea0c1179de5d55bc70c5fd40062444fc92-merged.mount: Deactivated successfully.
Jan 22 17:34:21 np0005592767 podman[225943]: 2026-01-22 22:34:21.595390524 +0000 UTC m=+0.087769675 container cleanup bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.597 182627 DEBUG nova.virt.libvirt.vif [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2065365074',display_name='tempest-ServerRescueNegativeTestJSON-server-2065365074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-2065365074',id=102,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:33:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='793af464aeec424ead871e11355f94e3',ramdisk_id='',reservation_id='r-kow01kp3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-554166911',owner_user_name='tempest-ServerRescueNegativeTestJSON-554166911-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:34:15Z,user_data=None,user_id='ee8ea21df7c94181896a0576e091d6bf',uuid=2cf209bf-cc1d-4f9f-953d-c73d3a446160,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.598 182627 DEBUG nova.network.os_vif_util [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converting VIF {"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.599 182627 DEBUG nova.network.os_vif_util [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.599 182627 DEBUG os_vif [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.601 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.601 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb4123d25-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:21 np0005592767 systemd[1]: libpod-conmon-bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a.scope: Deactivated successfully.
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.603 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.606 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.608 182627 INFO os_vif [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:8b:7a,bridge_name='br-int',has_traffic_filtering=True,id=b4123d25-3dda-4aac-84ab-b88e969ec9a7,network=Network(e48ef497-1516-42f4-a190-f681841535fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4123d25-3d')#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.608 182627 INFO nova.virt.libvirt.driver [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Deleting instance files /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160_del#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.609 182627 INFO nova.virt.libvirt.driver [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Deletion of /var/lib/nova/instances/2cf209bf-cc1d-4f9f-953d-c73d3a446160_del complete#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.646 182627 DEBUG nova.compute.manager [req-a30d97f5-7cd2-496e-9b6e-156093c2fa0e req-a95edf0d-635e-4085-ace2-56b1ac6f1ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-vif-unplugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.647 182627 DEBUG oslo_concurrency.lockutils [req-a30d97f5-7cd2-496e-9b6e-156093c2fa0e req-a95edf0d-635e-4085-ace2-56b1ac6f1ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.647 182627 DEBUG oslo_concurrency.lockutils [req-a30d97f5-7cd2-496e-9b6e-156093c2fa0e req-a95edf0d-635e-4085-ace2-56b1ac6f1ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.647 182627 DEBUG oslo_concurrency.lockutils [req-a30d97f5-7cd2-496e-9b6e-156093c2fa0e req-a95edf0d-635e-4085-ace2-56b1ac6f1ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.647 182627 DEBUG nova.compute.manager [req-a30d97f5-7cd2-496e-9b6e-156093c2fa0e req-a95edf0d-635e-4085-ace2-56b1ac6f1ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] No waiting events found dispatching network-vif-unplugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.648 182627 DEBUG nova.compute.manager [req-a30d97f5-7cd2-496e-9b6e-156093c2fa0e req-a95edf0d-635e-4085-ace2-56b1ac6f1ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-vif-unplugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:34:21 np0005592767 podman[225985]: 2026-01-22 22:34:21.656276267 +0000 UTC m=+0.040583480 container remove bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.661 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18e45db8-a61a-42a8-9f13-02e598df6829]: (4, ('Thu Jan 22 10:34:21 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb (bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a)\nbea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a\nThu Jan 22 10:34:21 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e48ef497-1516-42f4-a190-f681841535fb (bea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a)\nbea5b37cfb8af1293000f4711d2657de5974b086ca53aa88ec137a33828cbb3a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.662 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dede82-09ad-414f-9c49-fa0faca0aae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.663 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape48ef497-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.665 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 kernel: tape48ef497-10: left promiscuous mode
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.682 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.685 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3778063c-aac8-457a-9c0e-3a3176a58bb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.707 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5bed1260-e213-48cf-8b8e-ad8fdd69e139]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.708 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[909e486f-a22f-4d04-9d8f-bca9e4e41230]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.714 182627 INFO nova.compute.manager [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.715 182627 DEBUG oslo.service.loopingcall [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.716 182627 DEBUG nova.compute.manager [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:34:21 np0005592767 nova_compute[182623]: 2026-01-22 22:34:21.716 182627 DEBUG nova.network.neutron [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.723 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2c581be4-0c7f-487b-a51a-fb80b4cd9448]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482309, 'reachable_time': 30179, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226000, 'error': None, 'target': 'ovnmeta-e48ef497-1516-42f4-a190-f681841535fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.726 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e48ef497-1516-42f4-a190-f681841535fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:34:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:34:21.727 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[091b808a-93fa-438a-bb1e-7ad2ab1c86a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:34:21 np0005592767 systemd[1]: run-netns-ovnmeta\x2de48ef497\x2d1516\x2d42f4\x2da190\x2df681841535fb.mount: Deactivated successfully.
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.043 182627 DEBUG nova.network.neutron [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.065 182627 DEBUG nova.compute.manager [req-174f2d0d-7c3f-4f39-a29e-ff7e53d6c916 req-a16e2a08-b5ae-43ac-9fc7-40309612ed1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.065 182627 DEBUG oslo_concurrency.lockutils [req-174f2d0d-7c3f-4f39-a29e-ff7e53d6c916 req-a16e2a08-b5ae-43ac-9fc7-40309612ed1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.066 182627 DEBUG oslo_concurrency.lockutils [req-174f2d0d-7c3f-4f39-a29e-ff7e53d6c916 req-a16e2a08-b5ae-43ac-9fc7-40309612ed1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.066 182627 DEBUG oslo_concurrency.lockutils [req-174f2d0d-7c3f-4f39-a29e-ff7e53d6c916 req-a16e2a08-b5ae-43ac-9fc7-40309612ed1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.067 182627 DEBUG nova.compute.manager [req-174f2d0d-7c3f-4f39-a29e-ff7e53d6c916 req-a16e2a08-b5ae-43ac-9fc7-40309612ed1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] No waiting events found dispatching network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.067 182627 WARNING nova.compute.manager [req-174f2d0d-7c3f-4f39-a29e-ff7e53d6c916 req-a16e2a08-b5ae-43ac-9fc7-40309612ed1a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received unexpected event network-vif-plugged-b4123d25-3dda-4aac-84ab-b88e969ec9a7 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.069 182627 INFO nova.compute.manager [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Took 2.35 seconds to deallocate network for instance.#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.147 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.148 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.181 182627 DEBUG nova.compute.manager [req-815fb2fa-2a1e-4c30-8031-111d2bc13a90 req-027d5285-79d8-4f28-80f4-efc7041588dc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Received event network-vif-deleted-b4123d25-3dda-4aac-84ab-b88e969ec9a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.208 182627 DEBUG nova.compute.provider_tree [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.418 182627 DEBUG nova.scheduler.client.report [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.452 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.500 182627 INFO nova.scheduler.client.report [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Deleted allocations for instance 2cf209bf-cc1d-4f9f-953d-c73d3a446160#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.615 182627 DEBUG oslo_concurrency.lockutils [None req-f5859ffa-3f31-450f-acb1-0c3d8d4543ff ee8ea21df7c94181896a0576e091d6bf 793af464aeec424ead871e11355f94e3 - - default default] Lock "2cf209bf-cc1d-4f9f-953d-c73d3a446160" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.659 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Updating instance_info_cache with network_info: [{"id": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "address": "fa:16:3e:45:8b:7a", "network": {"id": "e48ef497-1516-42f4-a190-f681841535fb", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-896892098-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "793af464aeec424ead871e11355f94e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4123d25-3d", "ovs_interfaceid": "b4123d25-3dda-4aac-84ab-b88e969ec9a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.680 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-2cf209bf-cc1d-4f9f-953d-c73d3a446160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.680 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.681 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.681 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.681 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.682 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.682 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.702 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.703 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.703 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.703 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.873 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.875 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5676MB free_disk=73.23091888427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.875 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.875 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.949 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.950 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:34:24 np0005592767 nova_compute[182623]: 2026-01-22 22:34:24.986 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:34:25 np0005592767 nova_compute[182623]: 2026-01-22 22:34:25.012 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:34:25 np0005592767 nova_compute[182623]: 2026-01-22 22:34:25.043 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:34:25 np0005592767 nova_compute[182623]: 2026-01-22 22:34:25.043 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:25 np0005592767 nova_compute[182623]: 2026-01-22 22:34:25.044 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:25 np0005592767 nova_compute[182623]: 2026-01-22 22:34:25.044 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:34:26 np0005592767 nova_compute[182623]: 2026-01-22 22:34:26.449 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:26 np0005592767 nova_compute[182623]: 2026-01-22 22:34:26.603 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:26 np0005592767 nova_compute[182623]: 2026-01-22 22:34:26.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:26 np0005592767 nova_compute[182623]: 2026-01-22 22:34:26.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:26 np0005592767 nova_compute[182623]: 2026-01-22 22:34:26.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:27 np0005592767 podman[226003]: 2026-01-22 22:34:27.162385864 +0000 UTC m=+0.081763113 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:34:27 np0005592767 podman[226004]: 2026-01-22 22:34:27.172716927 +0000 UTC m=+0.081543808 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:34:27 np0005592767 nova_compute[182623]: 2026-01-22 22:34:27.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:34:27 np0005592767 nova_compute[182623]: 2026-01-22 22:34:27.914 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:34:27 np0005592767 nova_compute[182623]: 2026-01-22 22:34:27.922 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:27 np0005592767 nova_compute[182623]: 2026-01-22 22:34:27.946 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:34:31 np0005592767 nova_compute[182623]: 2026-01-22 22:34:31.452 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:31 np0005592767 nova_compute[182623]: 2026-01-22 22:34:31.605 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:33 np0005592767 nova_compute[182623]: 2026-01-22 22:34:33.545 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121258.5385325, a7098cc6-e436-4068-b543-d5fc63321c86 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:33 np0005592767 nova_compute[182623]: 2026-01-22 22:34:33.545 182627 INFO nova.compute.manager [-] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:34:33 np0005592767 nova_compute[182623]: 2026-01-22 22:34:33.570 182627 DEBUG nova.compute.manager [None req-5ca03a5d-f50f-4b64-8ca7-d9eeea1b2efc - - - - - -] [instance: a7098cc6-e436-4068-b543-d5fc63321c86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:35 np0005592767 podman[226043]: 2026-01-22 22:34:35.123109176 +0000 UTC m=+0.043421279 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.204 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.205 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.226 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.358 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.358 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.370 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.370 182627 INFO nova.compute.claims [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.453 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.501 182627 DEBUG nova.compute.provider_tree [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.514 182627 DEBUG nova.scheduler.client.report [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.536 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.537 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.585 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121261.5811236, 2cf209bf-cc1d-4f9f-953d-c73d3a446160 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.586 182627 INFO nova.compute.manager [-] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.589 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.607 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.612 182627 INFO nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.619 182627 DEBUG nova.compute.manager [None req-a6dcd567-4486-4ec4-81e8-24425bde47fb - - - - - -] [instance: 2cf209bf-cc1d-4f9f-953d-c73d3a446160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.625 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.750 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.751 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.751 182627 INFO nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Creating image(s)#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.752 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "/var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.752 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "/var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.753 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "/var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.770 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.853 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.855 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.855 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.873 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.940 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.942 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.991 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.993 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:36 np0005592767 nova_compute[182623]: 2026-01-22 22:34:36.993 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.064 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "67f705e6-627f-4055-8749-788689f48f24" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.065 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "67f705e6-627f-4055-8749-788689f48f24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.075 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.075 182627 DEBUG nova.virt.disk.api [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Checking if we can resize image /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.076 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.100 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.158 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.159 182627 DEBUG nova.virt.disk.api [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Cannot resize image /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.160 182627 DEBUG nova.objects.instance [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'migration_context' on Instance uuid f3948f2b-55e6-406b-b2bb-efff1b3a0c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.183 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.184 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Ensure instance console log exists: /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.185 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.186 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.186 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.189 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.201 182627 WARNING nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.217 182627 DEBUG nova.virt.libvirt.host [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.219 182627 DEBUG nova.virt.libvirt.host [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.223 182627 DEBUG nova.virt.libvirt.host [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.224 182627 DEBUG nova.virt.libvirt.host [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.226 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.226 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.227 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.228 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.228 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.229 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.229 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.229 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.230 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.230 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.231 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.231 182627 DEBUG nova.virt.hardware [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.238 182627 DEBUG nova.objects.instance [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'pci_devices' on Instance uuid f3948f2b-55e6-406b-b2bb-efff1b3a0c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.241 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.241 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.248 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.248 182627 INFO nova.compute.claims [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.253 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <uuid>f3948f2b-55e6-406b-b2bb-efff1b3a0c75</uuid>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <name>instance-0000006a</name>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerShowV247Test-server-651666016</nova:name>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:34:37</nova:creationTime>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:user uuid="ce5eb942050f4de7bfa9fb638e005a9a">tempest-ServerShowV247Test-2071373163-project-member</nova:user>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:        <nova:project uuid="927287947032499999e5d3ddc90dfdae">tempest-ServerShowV247Test-2071373163</nova:project>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <entry name="serial">f3948f2b-55e6-406b-b2bb-efff1b3a0c75</entry>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <entry name="uuid">f3948f2b-55e6-406b-b2bb-efff1b3a0c75</entry>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.config"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/console.log" append="off"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:34:37 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:34:37 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:34:37 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:34:37 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.315 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.315 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.316 182627 INFO nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Using config drive#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.396 182627 DEBUG nova.compute.provider_tree [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.412 182627 DEBUG nova.scheduler.client.report [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.433 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.434 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.487 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.505 182627 INFO nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.521 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.611 182627 INFO nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Creating config drive at /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.config#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.618 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwyr24e4c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.652 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.655 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.656 182627 INFO nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Creating image(s)#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.657 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.659 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.661 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.681 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.764 182627 DEBUG oslo_concurrency.processutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwyr24e4c" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.778 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.779 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.779 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.789 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.845 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.847 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 systemd-machined[153912]: New machine qemu-52-instance-0000006a.
Jan 22 17:34:37 np0005592767 systemd[1]: Started Virtual Machine qemu-52-instance-0000006a.
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.886 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.889 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.891 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.952 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.953 182627 DEBUG nova.virt.disk.api [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Checking if we can resize image /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:34:37 np0005592767 nova_compute[182623]: 2026-01-22 22:34:37.954 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.009 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.010 182627 DEBUG nova.virt.disk.api [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Cannot resize image /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.011 182627 DEBUG nova.objects.instance [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'migration_context' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.025 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.025 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Ensure instance console log exists: /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.026 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.026 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.027 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.029 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.034 182627 WARNING nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.040 182627 DEBUG nova.virt.libvirt.host [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.041 182627 DEBUG nova.virt.libvirt.host [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.044 182627 DEBUG nova.virt.libvirt.host [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.044 182627 DEBUG nova.virt.libvirt.host [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.045 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.046 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.047 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.047 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.047 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.048 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.048 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.048 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.049 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.049 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.050 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.050 182627 DEBUG nova.virt.hardware [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.056 182627 DEBUG nova.objects.instance [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'pci_devices' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.071 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <uuid>67f705e6-627f-4055-8749-788689f48f24</uuid>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <name>instance-0000006b</name>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerShowV247Test-server-2141912608</nova:name>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:34:38</nova:creationTime>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:user uuid="ce5eb942050f4de7bfa9fb638e005a9a">tempest-ServerShowV247Test-2071373163-project-member</nova:user>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:        <nova:project uuid="927287947032499999e5d3ddc90dfdae">tempest-ServerShowV247Test-2071373163</nova:project>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <entry name="serial">67f705e6-627f-4055-8749-788689f48f24</entry>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <entry name="uuid">67f705e6-627f-4055-8749-788689f48f24</entry>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/console.log" append="off"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:34:38 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:34:38 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:34:38 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:34:38 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.131 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.132 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.132 182627 INFO nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Using config drive#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.168 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121278.166903, f3948f2b-55e6-406b-b2bb-efff1b3a0c75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.169 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.170 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.171 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.177 182627 INFO nova.virt.libvirt.driver [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Instance spawned successfully.#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.177 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.197 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.202 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.205 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.205 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.206 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.206 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.206 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.207 182627 DEBUG nova.virt.libvirt.driver [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.232 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.233 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121278.1679485, f3948f2b-55e6-406b-b2bb-efff1b3a0c75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.233 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] VM Started (Lifecycle Event)#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.258 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.262 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.283 182627 INFO nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Took 1.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.283 182627 DEBUG nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.292 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.353 182627 INFO nova.compute.manager [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Took 2.04 seconds to build instance.#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.369 182627 DEBUG oslo_concurrency.lockutils [None req-d434e9fe-d349-4f50-84d1-b0f29ecab9bc ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.432 182627 INFO nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Creating config drive at /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.438 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_qmc0mry execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:38 np0005592767 nova_compute[182623]: 2026-01-22 22:34:38.574 182627 DEBUG oslo_concurrency.processutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_qmc0mry" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:38 np0005592767 systemd-machined[153912]: New machine qemu-53-instance-0000006b.
Jan 22 17:34:38 np0005592767 systemd[1]: Started Virtual Machine qemu-53-instance-0000006b.
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.542 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.545 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.546 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121279.541786, 67f705e6-627f-4055-8749-788689f48f24 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.546 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.560 182627 INFO nova.virt.libvirt.driver [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance spawned successfully.#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.562 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.571 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.580 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.598 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.599 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.600 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.601 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.602 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.604 182627 DEBUG nova.virt.libvirt.driver [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.611 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.612 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121279.543538, 67f705e6-627f-4055-8749-788689f48f24 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.612 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] VM Started (Lifecycle Event)#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.653 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.658 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.677 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.697 182627 INFO nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Took 2.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.698 182627 DEBUG nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.804 182627 INFO nova.compute.manager [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Took 2.61 seconds to build instance.#033[00m
Jan 22 17:34:39 np0005592767 nova_compute[182623]: 2026-01-22 22:34:39.827 182627 DEBUG oslo_concurrency.lockutils [None req-d14d7f2e-b05f-4147-941a-fac95060a40d ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "67f705e6-627f-4055-8749-788689f48f24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:41 np0005592767 nova_compute[182623]: 2026-01-22 22:34:41.456 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:41 np0005592767 nova_compute[182623]: 2026-01-22 22:34:41.608 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.129 182627 INFO nova.compute.manager [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Rebuilding instance#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.469 182627 DEBUG nova.compute.manager [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.547 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'pci_requests' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.559 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'pci_devices' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.571 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'resources' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.585 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'migration_context' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.597 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:34:42 np0005592767 nova_compute[182623]: 2026-01-22 22:34:42.603 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:34:46 np0005592767 podman[226152]: 2026-01-22 22:34:46.205752554 +0000 UTC m=+0.109318554 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:34:46 np0005592767 nova_compute[182623]: 2026-01-22 22:34:46.457 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:46 np0005592767 nova_compute[182623]: 2026-01-22 22:34:46.610 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:51 np0005592767 nova_compute[182623]: 2026-01-22 22:34:51.459 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:51 np0005592767 nova_compute[182623]: 2026-01-22 22:34:51.612 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:52 np0005592767 podman[226204]: 2026-01-22 22:34:52.18610196 +0000 UTC m=+0.084520901 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Jan 22 17:34:52 np0005592767 podman[226203]: 2026-01-22 22:34:52.233712897 +0000 UTC m=+0.129817103 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:34:52 np0005592767 nova_compute[182623]: 2026-01-22 22:34:52.655 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:34:54 np0005592767 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Jan 22 17:34:54 np0005592767 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006b.scope: Consumed 13.048s CPU time.
Jan 22 17:34:54 np0005592767 systemd-machined[153912]: Machine qemu-53-instance-0000006b terminated.
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.674 182627 INFO nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.680 182627 INFO nova.virt.libvirt.driver [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance destroyed successfully.#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.685 182627 INFO nova.virt.libvirt.driver [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance destroyed successfully.#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.686 182627 INFO nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Deleting instance files /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24_del#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.687 182627 INFO nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Deletion of /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24_del complete#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.888 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.889 182627 INFO nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Creating image(s)#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.890 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.891 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.892 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:55 np0005592767 nova_compute[182623]: 2026-01-22 22:34:55.921 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.003 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.005 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.006 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.025 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.086 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.088 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.134 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.136 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.136 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.211 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.213 182627 DEBUG nova.virt.disk.api [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Checking if we can resize image /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.214 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.276 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.278 182627 DEBUG nova.virt.disk.api [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Cannot resize image /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.279 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.280 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Ensure instance console log exists: /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.281 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.281 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.282 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.285 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.291 182627 WARNING nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.300 182627 DEBUG nova.virt.libvirt.host [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.301 182627 DEBUG nova.virt.libvirt.host [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.305 182627 DEBUG nova.virt.libvirt.host [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.305 182627 DEBUG nova.virt.libvirt.host [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.307 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.307 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.308 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.308 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.308 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.308 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.309 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.309 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.309 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.309 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.309 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.310 182627 DEBUG nova.virt.hardware [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.310 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'vcpu_model' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.333 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <uuid>67f705e6-627f-4055-8749-788689f48f24</uuid>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <name>instance-0000006b</name>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerShowV247Test-server-2141912608</nova:name>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:34:56</nova:creationTime>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:user uuid="ce5eb942050f4de7bfa9fb638e005a9a">tempest-ServerShowV247Test-2071373163-project-member</nova:user>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:        <nova:project uuid="927287947032499999e5d3ddc90dfdae">tempest-ServerShowV247Test-2071373163</nova:project>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <entry name="serial">67f705e6-627f-4055-8749-788689f48f24</entry>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <entry name="uuid">67f705e6-627f-4055-8749-788689f48f24</entry>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/console.log" append="off"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:34:56 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:34:56 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:34:56 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:34:56 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.388 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.388 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.389 182627 INFO nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Using config drive#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.406 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'ec2_ids' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.441 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'keypairs' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.506 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.614 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.713 182627 INFO nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Creating config drive at /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.724 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwfcecl8l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:34:56 np0005592767 nova_compute[182623]: 2026-01-22 22:34:56.866 182627 DEBUG oslo_concurrency.processutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwfcecl8l" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:34:56 np0005592767 systemd-machined[153912]: New machine qemu-54-instance-0000006b.
Jan 22 17:34:56 np0005592767 systemd[1]: Started Virtual Machine qemu-54-instance-0000006b.
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.734 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 67f705e6-627f-4055-8749-788689f48f24 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.735 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121297.7330544, 67f705e6-627f-4055-8749-788689f48f24 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.735 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.738 182627 DEBUG nova.compute.manager [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.739 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.743 182627 INFO nova.virt.libvirt.driver [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance spawned successfully.#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.743 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.797 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.800 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.801 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.801 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.802 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.802 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.803 182627 DEBUG nova.virt.libvirt.driver [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.808 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.852 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.853 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121297.7378018, 67f705e6-627f-4055-8749-788689f48f24 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.853 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] VM Started (Lifecycle Event)#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.882 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.885 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.911 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:34:57 np0005592767 nova_compute[182623]: 2026-01-22 22:34:57.932 182627 DEBUG nova.compute.manager [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:34:58 np0005592767 nova_compute[182623]: 2026-01-22 22:34:58.012 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:34:58 np0005592767 nova_compute[182623]: 2026-01-22 22:34:58.012 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:34:58 np0005592767 nova_compute[182623]: 2026-01-22 22:34:58.012 182627 DEBUG nova.objects.instance [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:34:58 np0005592767 podman[226303]: 2026-01-22 22:34:58.12907803 +0000 UTC m=+0.046889137 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:34:58 np0005592767 podman[226302]: 2026-01-22 22:34:58.129875933 +0000 UTC m=+0.052415224 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:34:58 np0005592767 nova_compute[182623]: 2026-01-22 22:34:58.235 182627 DEBUG oslo_concurrency.lockutils [None req-0f44d814-f5b1-479c-af32-4d1837480415 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.796 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "67f705e6-627f-4055-8749-788689f48f24" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.797 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "67f705e6-627f-4055-8749-788689f48f24" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.797 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "67f705e6-627f-4055-8749-788689f48f24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.798 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "67f705e6-627f-4055-8749-788689f48f24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.798 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "67f705e6-627f-4055-8749-788689f48f24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.813 182627 INFO nova.compute.manager [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Terminating instance#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.824 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "refresh_cache-67f705e6-627f-4055-8749-788689f48f24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.825 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquired lock "refresh_cache-67f705e6-627f-4055-8749-788689f48f24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:00 np0005592767 nova_compute[182623]: 2026-01-22 22:35:00.825 182627 DEBUG nova.network.neutron [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:35:01 np0005592767 nova_compute[182623]: 2026-01-22 22:35:01.547 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:01 np0005592767 nova_compute[182623]: 2026-01-22 22:35:01.615 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:01 np0005592767 nova_compute[182623]: 2026-01-22 22:35:01.622 182627 DEBUG nova.network.neutron [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:35:01 np0005592767 nova_compute[182623]: 2026-01-22 22:35:01.832 182627 DEBUG nova.network.neutron [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:01 np0005592767 nova_compute[182623]: 2026-01-22 22:35:01.850 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Releasing lock "refresh_cache-67f705e6-627f-4055-8749-788689f48f24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:01 np0005592767 nova_compute[182623]: 2026-01-22 22:35:01.850 182627 DEBUG nova.compute.manager [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:35:01 np0005592767 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Jan 22 17:35:01 np0005592767 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006b.scope: Consumed 4.800s CPU time.
Jan 22 17:35:01 np0005592767 systemd-machined[153912]: Machine qemu-54-instance-0000006b terminated.
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.108 182627 INFO nova.virt.libvirt.driver [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance destroyed successfully.#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.109 182627 DEBUG nova.objects.instance [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'resources' on Instance uuid 67f705e6-627f-4055-8749-788689f48f24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.122 182627 INFO nova.virt.libvirt.driver [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Deleting instance files /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24_del#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.123 182627 INFO nova.virt.libvirt.driver [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Deletion of /var/lib/nova/instances/67f705e6-627f-4055-8749-788689f48f24_del complete#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.185 182627 INFO nova.compute.manager [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: 67f705e6-627f-4055-8749-788689f48f24] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.187 182627 DEBUG oslo.service.loopingcall [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.187 182627 DEBUG nova.compute.manager [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.188 182627 DEBUG nova.network.neutron [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.619 182627 DEBUG nova.network.neutron [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.633 182627 DEBUG nova.network.neutron [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.648 182627 INFO nova.compute.manager [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] Took 0.46 seconds to deallocate network for instance.#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.723 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.724 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.811 182627 DEBUG nova.compute.provider_tree [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.846 182627 DEBUG nova.scheduler.client.report [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.873 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.901 182627 INFO nova.scheduler.client.report [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Deleted allocations for instance 67f705e6-627f-4055-8749-788689f48f24#033[00m
Jan 22 17:35:02 np0005592767 nova_compute[182623]: 2026-01-22 22:35:02.984 182627 DEBUG oslo_concurrency.lockutils [None req-584d4f64-58dd-438e-9597-e4825f1b88e1 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "67f705e6-627f-4055-8749-788689f48f24" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.282 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.282 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.283 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.283 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.284 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.298 182627 INFO nova.compute.manager [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Terminating instance#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.313 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "refresh_cache-f3948f2b-55e6-406b-b2bb-efff1b3a0c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.314 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquired lock "refresh_cache-f3948f2b-55e6-406b-b2bb-efff1b3a0c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.314 182627 DEBUG nova.network.neutron [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.502 182627 DEBUG nova.network.neutron [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.624 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:04.623 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:04.625 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.853 182627 DEBUG nova.network.neutron [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.871 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Releasing lock "refresh_cache-f3948f2b-55e6-406b-b2bb-efff1b3a0c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:04 np0005592767 nova_compute[182623]: 2026-01-22 22:35:04.872 182627 DEBUG nova.compute.manager [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:35:04 np0005592767 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Jan 22 17:35:04 np0005592767 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006a.scope: Consumed 12.988s CPU time.
Jan 22 17:35:04 np0005592767 systemd-machined[153912]: Machine qemu-52-instance-0000006a terminated.
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.108 182627 INFO nova.virt.libvirt.driver [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Instance destroyed successfully.#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.108 182627 DEBUG nova.objects.instance [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lazy-loading 'resources' on Instance uuid f3948f2b-55e6-406b-b2bb-efff1b3a0c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.120 182627 INFO nova.virt.libvirt.driver [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Deleting instance files /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75_del#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.120 182627 INFO nova.virt.libvirt.driver [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Deletion of /var/lib/nova/instances/f3948f2b-55e6-406b-b2bb-efff1b3a0c75_del complete#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.191 182627 INFO nova.compute.manager [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.192 182627 DEBUG oslo.service.loopingcall [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.193 182627 DEBUG nova.compute.manager [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.193 182627 DEBUG nova.network.neutron [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.634 182627 DEBUG nova.network.neutron [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.647 182627 DEBUG nova.network.neutron [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.660 182627 INFO nova.compute.manager [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Took 0.47 seconds to deallocate network for instance.#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.722 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.723 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.776 182627 DEBUG nova.compute.provider_tree [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.803 182627 DEBUG nova.scheduler.client.report [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.830 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.861 182627 INFO nova.scheduler.client.report [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Deleted allocations for instance f3948f2b-55e6-406b-b2bb-efff1b3a0c75#033[00m
Jan 22 17:35:05 np0005592767 nova_compute[182623]: 2026-01-22 22:35:05.976 182627 DEBUG oslo_concurrency.lockutils [None req-8f4c0006-5288-4b77-a8a8-3b76d0f68dd8 ce5eb942050f4de7bfa9fb638e005a9a 927287947032499999e5d3ddc90dfdae - - default default] Lock "f3948f2b-55e6-406b-b2bb-efff1b3a0c75" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:06 np0005592767 podman[226359]: 2026-01-22 22:35:06.165595716 +0000 UTC m=+0.080695764 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:35:06 np0005592767 nova_compute[182623]: 2026-01-22 22:35:06.555 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:06 np0005592767 nova_compute[182623]: 2026-01-22 22:35:06.617 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:09.628 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:11 np0005592767 nova_compute[182623]: 2026-01-22 22:35:11.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:11 np0005592767 nova_compute[182623]: 2026-01-22 22:35:11.673 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:12.108 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:12.108 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:12.109 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:14Z|00402|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 17:35:16 np0005592767 nova_compute[182623]: 2026-01-22 22:35:16.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:16 np0005592767 nova_compute[182623]: 2026-01-22 22:35:16.676 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:17 np0005592767 nova_compute[182623]: 2026-01-22 22:35:17.106 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121302.1049328, 67f705e6-627f-4055-8749-788689f48f24 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:17 np0005592767 nova_compute[182623]: 2026-01-22 22:35:17.106 182627 INFO nova.compute.manager [-] [instance: 67f705e6-627f-4055-8749-788689f48f24] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:35:17 np0005592767 podman[226383]: 2026-01-22 22:35:17.14241271 +0000 UTC m=+0.063765955 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:35:19 np0005592767 nova_compute[182623]: 2026-01-22 22:35:19.479 182627 DEBUG nova.compute.manager [None req-80c303f5-523b-43b9-9c89-7d1cca0fbe5a - - - - - -] [instance: 67f705e6-627f-4055-8749-788689f48f24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:19 np0005592767 nova_compute[182623]: 2026-01-22 22:35:19.929 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:19 np0005592767 nova_compute[182623]: 2026-01-22 22:35:19.930 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:20 np0005592767 nova_compute[182623]: 2026-01-22 22:35:20.106 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121305.1052792, f3948f2b-55e6-406b-b2bb-efff1b3a0c75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:20 np0005592767 nova_compute[182623]: 2026-01-22 22:35:20.106 182627 INFO nova.compute.manager [-] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:35:20 np0005592767 nova_compute[182623]: 2026-01-22 22:35:20.144 182627 DEBUG nova.compute.manager [None req-f81857c7-c631-421c-ab7d-74049bfd2e80 - - - - - -] [instance: f3948f2b-55e6-406b-b2bb-efff1b3a0c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:21 np0005592767 nova_compute[182623]: 2026-01-22 22:35:21.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:21 np0005592767 nova_compute[182623]: 2026-01-22 22:35:21.677 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:21 np0005592767 nova_compute[182623]: 2026-01-22 22:35:21.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:21 np0005592767 nova_compute[182623]: 2026-01-22 22:35:21.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:35:21 np0005592767 nova_compute[182623]: 2026-01-22 22:35:21.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:35:22 np0005592767 nova_compute[182623]: 2026-01-22 22:35:22.001 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:35:22 np0005592767 nova_compute[182623]: 2026-01-22 22:35:22.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:22 np0005592767 nova_compute[182623]: 2026-01-22 22:35:22.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:23 np0005592767 podman[226404]: 2026-01-22 22:35:23.133029516 +0000 UTC m=+0.052350472 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:35:23 np0005592767 podman[226403]: 2026-01-22 22:35:23.155255385 +0000 UTC m=+0.078263196 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:35:23 np0005592767 nova_compute[182623]: 2026-01-22 22:35:23.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:23 np0005592767 nova_compute[182623]: 2026-01-22 22:35:23.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:35:23 np0005592767 nova_compute[182623]: 2026-01-22 22:35:23.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.193 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.194 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.194 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.195 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.409 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.410 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5718MB free_disk=73.23089599609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.411 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:25 np0005592767 nova_compute[182623]: 2026-01-22 22:35:25.411 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.451 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.452 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.481 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.495 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.518 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.518 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.564 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:26 np0005592767 nova_compute[182623]: 2026-01-22 22:35:26.729 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:28 np0005592767 nova_compute[182623]: 2026-01-22 22:35:28.518 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:28 np0005592767 nova_compute[182623]: 2026-01-22 22:35:28.518 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:28 np0005592767 nova_compute[182623]: 2026-01-22 22:35:28.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:35:29 np0005592767 podman[226452]: 2026-01-22 22:35:29.124988101 +0000 UTC m=+0.047430783 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:35:29 np0005592767 podman[226453]: 2026-01-22 22:35:29.125364102 +0000 UTC m=+0.045977102 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:35:31 np0005592767 nova_compute[182623]: 2026-01-22 22:35:31.569 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:31 np0005592767 nova_compute[182623]: 2026-01-22 22:35:31.737 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:32 np0005592767 nova_compute[182623]: 2026-01-22 22:35:32.821 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:32 np0005592767 nova_compute[182623]: 2026-01-22 22:35:32.821 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:32 np0005592767 nova_compute[182623]: 2026-01-22 22:35:32.877 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.037 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.038 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.048 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.049 182627 INFO nova.compute.claims [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.093 182627 DEBUG nova.compute.manager [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.238 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.285 182627 DEBUG nova.compute.provider_tree [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.308 182627 DEBUG nova.scheduler.client.report [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.329 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.330 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.333 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.368 182627 DEBUG nova.objects.instance [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.385 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.385 182627 INFO nova.compute.claims [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.386 182627 DEBUG nova.objects.instance [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'resources' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.399 182627 DEBUG nova.objects.instance [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.403 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.403 182627 DEBUG nova.network.neutron [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.412 182627 DEBUG nova.objects.instance [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.423 182627 INFO nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.470 182627 INFO nova.compute.resource_tracker [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updating resource usage from migration 92d59f76-0b0b-485c-a590-0deb5815dec2#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.470 182627 DEBUG nova.compute.resource_tracker [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Starting to track incoming migration 92d59f76-0b0b-485c-a590-0deb5815dec2 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.473 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.578 182627 DEBUG nova.compute.provider_tree [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.609 182627 DEBUG nova.scheduler.client.report [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.642 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.644 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.645 182627 INFO nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Creating image(s)#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.646 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.646 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.647 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.674 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.675 182627 INFO nova.compute.manager [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Migrating#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.685 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.747 182627 DEBUG nova.policy [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.761 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.762 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.762 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.773 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.825 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.826 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.862 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.863 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.863 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.914 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.915 182627 DEBUG nova.virt.disk.api [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Checking if we can resize image /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:35:33 np0005592767 nova_compute[182623]: 2026-01-22 22:35:33.915 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.005 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.006 182627 DEBUG nova.virt.disk.api [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Cannot resize image /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.007 182627 DEBUG nova.objects.instance [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'migration_context' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.024 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.025 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Ensure instance console log exists: /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.025 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.026 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:34 np0005592767 nova_compute[182623]: 2026-01-22 22:35:34.026 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:35 np0005592767 nova_compute[182623]: 2026-01-22 22:35:35.343 182627 DEBUG nova.network.neutron [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Successfully created port: 4b5c1570-4e54-4f2b-a349-702a4160e13a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:35:36 np0005592767 nova_compute[182623]: 2026-01-22 22:35:36.574 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:36 np0005592767 nova_compute[182623]: 2026-01-22 22:35:36.771 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.025 182627 DEBUG nova.network.neutron [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Successfully updated port: 4b5c1570-4e54-4f2b-a349-702a4160e13a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.050 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.051 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquired lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.051 182627 DEBUG nova.network.neutron [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:35:37 np0005592767 podman[226508]: 2026-01-22 22:35:37.122011171 +0000 UTC m=+0.044733487 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.131 182627 DEBUG nova.compute.manager [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-changed-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.131 182627 DEBUG nova.compute.manager [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Refreshing instance network info cache due to event network-changed-4b5c1570-4e54-4f2b-a349-702a4160e13a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.131 182627 DEBUG oslo_concurrency.lockutils [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:37 np0005592767 nova_compute[182623]: 2026-01-22 22:35:37.263 182627 DEBUG nova.network.neutron [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:35:37 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:35:37 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:35:37 np0005592767 systemd-logind[802]: New session 59 of user nova.
Jan 22 17:35:37 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:35:37 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:35:37 np0005592767 systemd[226536]: Queued start job for default target Main User Target.
Jan 22 17:35:37 np0005592767 systemd[226536]: Created slice User Application Slice.
Jan 22 17:35:37 np0005592767 systemd[226536]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:35:37 np0005592767 systemd[226536]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:35:37 np0005592767 systemd[226536]: Reached target Paths.
Jan 22 17:35:37 np0005592767 systemd[226536]: Reached target Timers.
Jan 22 17:35:37 np0005592767 systemd[226536]: Starting D-Bus User Message Bus Socket...
Jan 22 17:35:37 np0005592767 systemd[226536]: Starting Create User's Volatile Files and Directories...
Jan 22 17:35:37 np0005592767 systemd[226536]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:35:37 np0005592767 systemd[226536]: Reached target Sockets.
Jan 22 17:35:37 np0005592767 systemd[226536]: Finished Create User's Volatile Files and Directories.
Jan 22 17:35:37 np0005592767 systemd[226536]: Reached target Basic System.
Jan 22 17:35:37 np0005592767 systemd[226536]: Reached target Main User Target.
Jan 22 17:35:37 np0005592767 systemd[226536]: Startup finished in 144ms.
Jan 22 17:35:37 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:35:37 np0005592767 systemd[1]: Started Session 59 of User nova.
Jan 22 17:35:37 np0005592767 systemd[1]: session-59.scope: Deactivated successfully.
Jan 22 17:35:37 np0005592767 systemd-logind[802]: Session 59 logged out. Waiting for processes to exit.
Jan 22 17:35:37 np0005592767 systemd-logind[802]: Removed session 59.
Jan 22 17:35:37 np0005592767 systemd-logind[802]: New session 61 of user nova.
Jan 22 17:35:37 np0005592767 systemd[1]: Started Session 61 of User nova.
Jan 22 17:35:37 np0005592767 systemd[1]: session-61.scope: Deactivated successfully.
Jan 22 17:35:37 np0005592767 systemd-logind[802]: Session 61 logged out. Waiting for processes to exit.
Jan 22 17:35:37 np0005592767 systemd-logind[802]: Removed session 61.
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.859 182627 DEBUG nova.network.neutron [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Updating instance_info_cache with network_info: [{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.890 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Releasing lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.890 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance network_info: |[{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.891 182627 DEBUG oslo_concurrency.lockutils [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.891 182627 DEBUG nova.network.neutron [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Refreshing network info cache for port 4b5c1570-4e54-4f2b-a349-702a4160e13a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.895 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Start _get_guest_xml network_info=[{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.903 182627 WARNING nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.914 182627 DEBUG nova.virt.libvirt.host [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.915 182627 DEBUG nova.virt.libvirt.host [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.925 182627 DEBUG nova.virt.libvirt.host [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.926 182627 DEBUG nova.virt.libvirt.host [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.928 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.928 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.929 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.929 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.929 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.929 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.930 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.930 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.930 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.930 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.931 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.931 182627 DEBUG nova.virt.hardware [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.936 182627 DEBUG nova.virt.libvirt.vif [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:35:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1138865448',display_name='tempest-ServerStableDeviceRescueTest-server-1138865448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1138865448',id=110,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-6arbga1b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name=
'tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:33Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=f160acde-2aa8-4109-94ea-ba98aaf63ad3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.936 182627 DEBUG nova.network.os_vif_util [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.937 182627 DEBUG nova.network.os_vif_util [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.938 182627 DEBUG nova.objects.instance [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'pci_devices' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.963 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <uuid>f160acde-2aa8-4109-94ea-ba98aaf63ad3</uuid>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <name>instance-0000006e</name>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1138865448</nova:name>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:35:38</nova:creationTime>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:user uuid="9d1e26d3056148e692e157703469d77a">tempest-ServerStableDeviceRescueTest-395714292-project-member</nova:user>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:project uuid="9b1f07a8546648baba916fffc53a0b93">tempest-ServerStableDeviceRescueTest-395714292</nova:project>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        <nova:port uuid="4b5c1570-4e54-4f2b-a349-702a4160e13a">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <entry name="serial">f160acde-2aa8-4109-94ea-ba98aaf63ad3</entry>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <entry name="uuid">f160acde-2aa8-4109-94ea-ba98aaf63ad3</entry>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b1:99:38"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <target dev="tap4b5c1570-4e"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/console.log" append="off"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:35:38 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:35:38 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:35:38 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:35:38 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.963 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Preparing to wait for external event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.964 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.964 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.964 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.965 182627 DEBUG nova.virt.libvirt.vif [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:35:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1138865448',display_name='tempest-ServerStableDeviceRescueTest-server-1138865448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1138865448',id=110,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-6arbga1b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_
user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:33Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=f160acde-2aa8-4109-94ea-ba98aaf63ad3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.965 182627 DEBUG nova.network.os_vif_util [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.966 182627 DEBUG nova.network.os_vif_util [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.966 182627 DEBUG os_vif [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.966 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.967 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.967 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.972 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.972 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b5c1570-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.973 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b5c1570-4e, col_values=(('external_ids', {'iface-id': '4b5c1570-4e54-4f2b-a349-702a4160e13a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:99:38', 'vm-uuid': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.975 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:38 np0005592767 NetworkManager[54973]: <info>  [1769121338.9772] manager: (tap4b5c1570-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.978 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.983 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:38 np0005592767 nova_compute[182623]: 2026-01-22 22:35:38.984 182627 INFO os_vif [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e')#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.034 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.034 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.035 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No VIF found with MAC fa:16:3e:b1:99:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.035 182627 INFO nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Using config drive#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.769 182627 INFO nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Creating config drive at /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.777 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv4929s0o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:39 np0005592767 nova_compute[182623]: 2026-01-22 22:35:39.920 182627 DEBUG oslo_concurrency.processutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv4929s0o" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:39 np0005592767 kernel: tap4b5c1570-4e: entered promiscuous mode
Jan 22 17:35:39 np0005592767 NetworkManager[54973]: <info>  [1769121339.9909] manager: (tap4b5c1570-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/190)
Jan 22 17:35:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:40Z|00403|binding|INFO|Claiming lport 4b5c1570-4e54-4f2b-a349-702a4160e13a for this chassis.
Jan 22 17:35:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:40Z|00404|binding|INFO|4b5c1570-4e54-4f2b-a349-702a4160e13a: Claiming fa:16:3e:b1:99:38 10.100.0.13
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.023 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.031 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 systemd-udevd[226576]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:40 np0005592767 NetworkManager[54973]: <info>  [1769121340.0596] device (tap4b5c1570-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:35:40 np0005592767 NetworkManager[54973]: <info>  [1769121340.0602] device (tap4b5c1570-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.074 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.076 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 bound to our chassis#033[00m
Jan 22 17:35:40 np0005592767 systemd-machined[153912]: New machine qemu-55-instance-0000006e.
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.078 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2345e3-0b74-4aee-aa42-da6620725bb2#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.090 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[664cf24c-1172-47be-8fc4-7be81be13b12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.091 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2345e3-01 in ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.094 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2345e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.094 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0f279a-1093-4505-9948-9eacddf08e78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.095 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2de33a5e-ce77-4202-bd5a-38329b1b6d57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 systemd[1]: Started Virtual Machine qemu-55-instance-0000006e.
Jan 22 17:35:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:40Z|00405|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a ovn-installed in OVS
Jan 22 17:35:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:40Z|00406|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a up in Southbound
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.109 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6f1cad-1d33-4f76-9099-af0eb2226e76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.122 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8de7cfde-e3fd-4643-a5db-323372421b42]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.152 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[997812eb-0466-4454-9e3e-4e9c6ed3fa58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.158 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1603cd40-0c6d-4991-9392-f4fd2dfea6fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 systemd-udevd[226580]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:40 np0005592767 NetworkManager[54973]: <info>  [1769121340.1599] manager: (tapad2345e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/191)
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.191 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c673efe5-f223-4f56-8d8d-86d2748bfb08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.196 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0c81894a-328c-4d26-8e61-4c1f69cbbe59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 NetworkManager[54973]: <info>  [1769121340.2240] device (tapad2345e3-00): carrier: link connected
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.228 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[945fcc9a-23bf-41b2-a889-917107b83eab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.243 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a4cff773-c07e-463f-85a2-842d283496f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493680, 'reachable_time': 23053, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226612, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.257 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[238db62e-2eb0-432b-bb9c-45b6b462f370]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:33c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493680, 'tstamp': 493680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226615, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.272 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7b4d570b-b1fa-49ed-a50e-d5055c445c4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493680, 'reachable_time': 23053, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226619, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.304 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a84656b8-0bd3-4410-8b63-824f6d456a6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.354 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121340.3535907, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.354 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Started (Lifecycle Event)#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.357 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[51ceec7b-dadd-4f66-bd1e-d93d4c239b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.359 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.359 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.360 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2345e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.361 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 NetworkManager[54973]: <info>  [1769121340.3623] manager: (tapad2345e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Jan 22 17:35:40 np0005592767 kernel: tapad2345e3-00: entered promiscuous mode
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.363 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.364 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2345e3-00, col_values=(('external_ids', {'iface-id': 'bd160f04-1c71-4851-91cb-64d88f335d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.365 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:40Z|00407|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.366 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.366 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.367 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[94236a68-d079-4748-a894-9aeffc409f5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.368 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:35:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:40.370 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'env', 'PROCESS_TAG=haproxy-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2345e3-0b74-4aee-aa42-da6620725bb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.376 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.413 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.417 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121340.35781, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.417 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.449 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.452 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.484 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.531 182627 DEBUG nova.compute.manager [req-c0ab3fa7-2082-4738-9edf-18db471518af req-3b74a199-e856-440a-8141-590b0fe30b39 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.532 182627 DEBUG oslo_concurrency.lockutils [req-c0ab3fa7-2082-4738-9edf-18db471518af req-3b74a199-e856-440a-8141-590b0fe30b39 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.532 182627 DEBUG oslo_concurrency.lockutils [req-c0ab3fa7-2082-4738-9edf-18db471518af req-3b74a199-e856-440a-8141-590b0fe30b39 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.532 182627 DEBUG oslo_concurrency.lockutils [req-c0ab3fa7-2082-4738-9edf-18db471518af req-3b74a199-e856-440a-8141-590b0fe30b39 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.532 182627 DEBUG nova.compute.manager [req-c0ab3fa7-2082-4738-9edf-18db471518af req-3b74a199-e856-440a-8141-590b0fe30b39 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Processing event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.533 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.536 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.537 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121340.5374353, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.537 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.540 182627 INFO nova.virt.libvirt.driver [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance spawned successfully.#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.540 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.564 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.576 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.579 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.579 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.579 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.580 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.580 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.580 182627 DEBUG nova.virt.libvirt.driver [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.608 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.642 182627 DEBUG nova.compute.manager [req-e5edc58d-5c5a-42bf-96dd-bfa20a4b29c5 req-dbbd1691-5b96-44df-80b2-36baae1f70a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.643 182627 DEBUG oslo_concurrency.lockutils [req-e5edc58d-5c5a-42bf-96dd-bfa20a4b29c5 req-dbbd1691-5b96-44df-80b2-36baae1f70a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.643 182627 DEBUG oslo_concurrency.lockutils [req-e5edc58d-5c5a-42bf-96dd-bfa20a4b29c5 req-dbbd1691-5b96-44df-80b2-36baae1f70a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.643 182627 DEBUG oslo_concurrency.lockutils [req-e5edc58d-5c5a-42bf-96dd-bfa20a4b29c5 req-dbbd1691-5b96-44df-80b2-36baae1f70a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.643 182627 DEBUG nova.compute.manager [req-e5edc58d-5c5a-42bf-96dd-bfa20a4b29c5 req-dbbd1691-5b96-44df-80b2-36baae1f70a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.643 182627 WARNING nova.compute.manager [req-e5edc58d-5c5a-42bf-96dd-bfa20a4b29c5 req-dbbd1691-5b96-44df-80b2-36baae1f70a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.672 182627 INFO nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Took 7.03 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.673 182627 DEBUG nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:40 np0005592767 podman[226652]: 2026-01-22 22:35:40.764493636 +0000 UTC m=+0.054022829 container create 22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.767 182627 INFO nova.compute.manager [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Took 7.81 seconds to build instance.#033[00m
Jan 22 17:35:40 np0005592767 nova_compute[182623]: 2026-01-22 22:35:40.797 182627 DEBUG oslo_concurrency.lockutils [None req-90b63251-a0d2-405d-8a69-43ef0fa3ca9a 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:40 np0005592767 systemd[1]: Started libpod-conmon-22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7.scope.
Jan 22 17:35:40 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:35:40 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88f93a804cd01757ba5d73a89cf9f5e4b94ff11206e000d609457a521d17c14e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:35:40 np0005592767 podman[226652]: 2026-01-22 22:35:40.736422622 +0000 UTC m=+0.025951835 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:35:40 np0005592767 podman[226652]: 2026-01-22 22:35:40.845131208 +0000 UTC m=+0.134660421 container init 22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:35:40 np0005592767 podman[226652]: 2026-01-22 22:35:40.850286194 +0000 UTC m=+0.139815387 container start 22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:35:40 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [NOTICE]   (226671) : New worker (226673) forked
Jan 22 17:35:40 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [NOTICE]   (226671) : Loading success.
Jan 22 17:35:41 np0005592767 systemd-logind[802]: New session 62 of user nova.
Jan 22 17:35:41 np0005592767 systemd[1]: Started Session 62 of User nova.
Jan 22 17:35:41 np0005592767 nova_compute[182623]: 2026-01-22 22:35:41.576 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:41 np0005592767 systemd[1]: session-62.scope: Deactivated successfully.
Jan 22 17:35:41 np0005592767 systemd-logind[802]: Session 62 logged out. Waiting for processes to exit.
Jan 22 17:35:41 np0005592767 systemd-logind[802]: Removed session 62.
Jan 22 17:35:41 np0005592767 systemd-logind[802]: New session 63 of user nova.
Jan 22 17:35:41 np0005592767 nova_compute[182623]: 2026-01-22 22:35:41.885 182627 DEBUG nova.network.neutron [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Updated VIF entry in instance network info cache for port 4b5c1570-4e54-4f2b-a349-702a4160e13a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:35:41 np0005592767 nova_compute[182623]: 2026-01-22 22:35:41.885 182627 DEBUG nova.network.neutron [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Updating instance_info_cache with network_info: [{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:41 np0005592767 systemd[1]: Started Session 63 of User nova.
Jan 22 17:35:41 np0005592767 nova_compute[182623]: 2026-01-22 22:35:41.907 182627 DEBUG oslo_concurrency.lockutils [req-d3878cc7-c7c3-41ba-b604-d1ce8766baa4 req-f07c6e63-26b0-435f-a17b-90cee97fbf6d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:41 np0005592767 systemd[1]: session-63.scope: Deactivated successfully.
Jan 22 17:35:41 np0005592767 systemd-logind[802]: Session 63 logged out. Waiting for processes to exit.
Jan 22 17:35:41 np0005592767 systemd-logind[802]: Removed session 63.
Jan 22 17:35:42 np0005592767 systemd-logind[802]: New session 64 of user nova.
Jan 22 17:35:42 np0005592767 systemd[1]: Started Session 64 of User nova.
Jan 22 17:35:42 np0005592767 systemd[1]: session-64.scope: Deactivated successfully.
Jan 22 17:35:42 np0005592767 systemd-logind[802]: Session 64 logged out. Waiting for processes to exit.
Jan 22 17:35:42 np0005592767 systemd-logind[802]: Removed session 64.
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.670 182627 DEBUG nova.compute.manager [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.670 182627 DEBUG oslo_concurrency.lockutils [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.670 182627 DEBUG oslo_concurrency.lockutils [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.671 182627 DEBUG oslo_concurrency.lockutils [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.671 182627 DEBUG nova.compute.manager [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.671 182627 WARNING nova.compute.manager [req-33d41416-fd9b-4cb8-86fb-6b04886e1a01 req-ed96f0c6-ac56-4114-9386-f48bd31be257 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state None.#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.962 182627 DEBUG nova.compute.manager [req-b5267b2e-916b-4e23-809c-594d9d093418 req-204c4138-a93a-4e25-b44a-16b36a5cf426 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.962 182627 DEBUG oslo_concurrency.lockutils [req-b5267b2e-916b-4e23-809c-594d9d093418 req-204c4138-a93a-4e25-b44a-16b36a5cf426 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.962 182627 DEBUG oslo_concurrency.lockutils [req-b5267b2e-916b-4e23-809c-594d9d093418 req-204c4138-a93a-4e25-b44a-16b36a5cf426 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.963 182627 DEBUG oslo_concurrency.lockutils [req-b5267b2e-916b-4e23-809c-594d9d093418 req-204c4138-a93a-4e25-b44a-16b36a5cf426 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.963 182627 DEBUG nova.compute.manager [req-b5267b2e-916b-4e23-809c-594d9d093418 req-204c4138-a93a-4e25-b44a-16b36a5cf426 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:35:42 np0005592767 nova_compute[182623]: 2026-01-22 22:35:42.963 182627 WARNING nova.compute.manager [req-b5267b2e-916b-4e23-809c-594d9d093418 req-204c4138-a93a-4e25-b44a-16b36a5cf426 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 22 17:35:43 np0005592767 nova_compute[182623]: 2026-01-22 22:35:43.225 182627 INFO nova.network.neutron [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updating port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 22 17:35:43 np0005592767 nova_compute[182623]: 2026-01-22 22:35:43.794 182627 DEBUG nova.compute.manager [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:43 np0005592767 nova_compute[182623]: 2026-01-22 22:35:43.857 182627 INFO nova.compute.manager [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] instance snapshotting#033[00m
Jan 22 17:35:43 np0005592767 nova_compute[182623]: 2026-01-22 22:35:43.976 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.563 182627 INFO nova.virt.libvirt.driver [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Beginning live snapshot process#033[00m
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.658 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.659 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquired lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.660 182627 DEBUG nova.network.neutron [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:35:44 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.870 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.967 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk --force-share --output=json -f qcow2" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:44 np0005592767 nova_compute[182623]: 2026-01-22 22:35:44.969 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.046 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk --force-share --output=json -f qcow2" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.060 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.113 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.114 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpmkcqaz1a/da63b4fe030a44dd9390c7c5c1cc183e.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.147 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpmkcqaz1a/da63b4fe030a44dd9390c7c5c1cc183e.delta 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.149 182627 INFO nova.virt.libvirt.driver [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.209 182627 DEBUG nova.virt.libvirt.guest [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.214 182627 INFO nova.virt.libvirt.driver [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.250 182627 DEBUG nova.compute.manager [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-changed-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.251 182627 DEBUG nova.compute.manager [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Refreshing instance network info cache due to event network-changed-0b3799ee-2c54-4b41-a4fd-6b8596e79125. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.251 182627 DEBUG oslo_concurrency.lockutils [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.259 182627 DEBUG nova.privsep.utils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.260 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpmkcqaz1a/da63b4fe030a44dd9390c7c5c1cc183e.delta /var/lib/nova/instances/snapshots/tmpmkcqaz1a/da63b4fe030a44dd9390c7c5c1cc183e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.416 182627 DEBUG oslo_concurrency.processutils [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpmkcqaz1a/da63b4fe030a44dd9390c7c5c1cc183e.delta /var/lib/nova/instances/snapshots/tmpmkcqaz1a/da63b4fe030a44dd9390c7c5c1cc183e" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:45 np0005592767 nova_compute[182623]: 2026-01-22 22:35:45.417 182627 INFO nova.virt.libvirt.driver [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:35:46 np0005592767 nova_compute[182623]: 2026-01-22 22:35:46.579 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.270 182627 DEBUG nova.network.neutron [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updating instance_info_cache with network_info: [{"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.310 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Releasing lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.317 182627 DEBUG oslo_concurrency.lockutils [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.318 182627 DEBUG nova.network.neutron [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Refreshing network info cache for port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.462 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.465 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.465 182627 INFO nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Creating image(s)#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.467 182627 DEBUG nova.objects.instance [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.479 182627 DEBUG oslo_concurrency.processutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.550 182627 DEBUG oslo_concurrency.processutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.552 182627 DEBUG nova.virt.disk.api [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Checking if we can resize image /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.553 182627 DEBUG oslo_concurrency.processutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.646 182627 DEBUG oslo_concurrency.processutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.648 182627 DEBUG nova.virt.disk.api [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Cannot resize image /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.665 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.665 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Ensure instance console log exists: /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.666 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.667 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.667 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.672 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Start _get_guest_xml network_info=[{"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--154670647", "vif_mac": "fa:16:3e:d7:b3:21"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.677 182627 WARNING nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.686 182627 DEBUG nova.virt.libvirt.host [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.687 182627 DEBUG nova.virt.libvirt.host [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.691 182627 DEBUG nova.virt.libvirt.host [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.692 182627 DEBUG nova.virt.libvirt.host [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.694 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.694 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.695 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.696 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.696 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.697 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.697 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.697 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.698 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.698 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.699 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.699 182627 DEBUG nova.virt.hardware [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.700 182627 DEBUG nova.objects.instance [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.721 182627 DEBUG oslo_concurrency.processutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.785 182627 DEBUG oslo_concurrency.processutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.786 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "/var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.787 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "/var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.788 182627 DEBUG oslo_concurrency.lockutils [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "/var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.789 182627 DEBUG nova.virt.libvirt.vif [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-835067223',display_name='tempest-TestNetworkAdvancedServerOps-server-835067223',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-835067223',id=109,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDxykQRXSNI+vSIm+OvqW9Fj8rLp6B2aYNvIPpABIfDgOXpNo2F13/vwM8hDU3IOu3FDDhj5A57STA0vGN4KBedUqz5S0z+W5QUE2jUWvHQHsl24ZGJuBG8cdlli+DZNLg==',key_name='tempest-TestNetworkAdvancedServerOps-1367215877',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-kdmvtsoy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:42Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=5b929866-486a-4348-9787-e2f273dbecc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--154670647", "vif_mac": "fa:16:3e:d7:b3:21"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.790 182627 DEBUG nova.network.os_vif_util [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converting VIF {"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--154670647", "vif_mac": "fa:16:3e:d7:b3:21"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.791 182627 DEBUG nova.network.os_vif_util [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.794 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <uuid>5b929866-486a-4348-9787-e2f273dbecc8</uuid>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <name>instance-0000006d</name>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-835067223</nova:name>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:35:47</nova:creationTime>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        <nova:port uuid="0b3799ee-2c54-4b41-a4fd-6b8596e79125">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <entry name="serial">5b929866-486a-4348-9787-e2f273dbecc8</entry>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <entry name="uuid">5b929866-486a-4348-9787-e2f273dbecc8</entry>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/disk.config"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:d7:b3:21"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <target dev="tap0b3799ee-2c"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/console.log" append="off"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:35:47 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:35:47 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:35:47 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:35:47 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.796 182627 DEBUG nova.virt.libvirt.vif [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-835067223',display_name='tempest-TestNetworkAdvancedServerOps-server-835067223',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-835067223',id=109,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDxykQRXSNI+vSIm+OvqW9Fj8rLp6B2aYNvIPpABIfDgOXpNo2F13/vwM8hDU3IOu3FDDhj5A57STA0vGN4KBedUqz5S0z+W5QUE2jUWvHQHsl24ZGJuBG8cdlli+DZNLg==',key_name='tempest-TestNetworkAdvancedServerOps-1367215877',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:34:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-kdmvtsoy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:42Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=5b929866-486a-4348-9787-e2f273dbecc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--154670647", "vif_mac": "fa:16:3e:d7:b3:21"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.796 182627 DEBUG nova.network.os_vif_util [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converting VIF {"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--154670647", "vif_mac": "fa:16:3e:d7:b3:21"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.796 182627 DEBUG nova.network.os_vif_util [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.797 182627 DEBUG os_vif [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.798 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.798 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.799 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.802 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.802 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b3799ee-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.803 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b3799ee-2c, col_values=(('external_ids', {'iface-id': '0b3799ee-2c54-4b41-a4fd-6b8596e79125', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:b3:21', 'vm-uuid': '5b929866-486a-4348-9787-e2f273dbecc8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.805 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.807 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.812 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.813 182627 INFO os_vif [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c')#033[00m
Jan 22 17:35:47 np0005592767 NetworkManager[54973]: <info>  [1769121347.8195] manager: (tap0b3799ee-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.882 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.883 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.883 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] No VIF found with MAC fa:16:3e:d7:b3:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.884 182627 INFO nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Using config drive#033[00m
Jan 22 17:35:47 np0005592767 kernel: tap0b3799ee-2c: entered promiscuous mode
Jan 22 17:35:47 np0005592767 NetworkManager[54973]: <info>  [1769121347.9505] manager: (tap0b3799ee-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/194)
Jan 22 17:35:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:47Z|00408|binding|INFO|Claiming lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 for this chassis.
Jan 22 17:35:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:47Z|00409|binding|INFO|0b3799ee-2c54-4b41-a4fd-6b8596e79125: Claiming fa:16:3e:d7:b3:21 10.100.0.12
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 NetworkManager[54973]: <info>  [1769121347.9728] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Jan 22 17:35:47 np0005592767 NetworkManager[54973]: <info>  [1769121347.9733] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Jan 22 17:35:47 np0005592767 nova_compute[182623]: 2026-01-22 22:35:47.972 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:47.983 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:b3:21 10.100.0.12'], port_security=['fa:16:3e:d7:b3:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5b929866-486a-4348-9787-e2f273dbecc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3e777c20-1be0-45e2-8716-c8375f2870cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31433937-7792-4bbb-b1e0-0b9aaac0b8c0, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0b3799ee-2c54-4b41-a4fd-6b8596e79125) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:47.985 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 in datapath 48ecc0e0-69c0-4e79-a289-0bf82207c044 bound to our chassis#033[00m
Jan 22 17:35:47 np0005592767 systemd-udevd[226757]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:47.989 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48ecc0e0-69c0-4e79-a289-0bf82207c044#033[00m
Jan 22 17:35:48 np0005592767 NetworkManager[54973]: <info>  [1769121348.0016] device (tap0b3799ee-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:35:48 np0005592767 NetworkManager[54973]: <info>  [1769121348.0027] device (tap0b3799ee-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.003 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b70f1f9e-1864-4236-a580-8d6d1cf8cf96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.004 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48ecc0e0-61 in ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.008 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48ecc0e0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.009 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b168f97a-5db1-479d-bd8a-ca6f7b3be4ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.010 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0ef417-9fbb-4f2f-ac27-b2d3350c149f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.030 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a2218b2b-48c6-485e-82c9-2df6bca53d91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 systemd-machined[153912]: New machine qemu-56-instance-0000006d.
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.069 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8721fec9-dfc9-49f7-9537-8f863996dbcd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.100 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8dafee-4f9a-4d97-b5df-e89365f57e48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 systemd-udevd[226762]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:48 np0005592767 NetworkManager[54973]: <info>  [1769121348.1140] manager: (tap48ecc0e0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/197)
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.113 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[89dc2217-d35c-491e-a834-fc251a6467f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 systemd[1]: Started Virtual Machine qemu-56-instance-0000006d.
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:48Z|00410|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 17:35:48 np0005592767 podman[226746]: 2026-01-22 22:35:48.172145449 +0000 UTC m=+0.212859954 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.175 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:48Z|00411|binding|INFO|Setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 ovn-installed in OVS
Jan 22 17:35:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:48Z|00412|binding|INFO|Setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 up in Southbound
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.185 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.197 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdb0711-eaf4-4c99-b829-50a4301bde33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.202 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ebbf9e-ac03-47dd-9b76-aa6607c29a68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 NetworkManager[54973]: <info>  [1769121348.2294] device (tap48ecc0e0-60): carrier: link connected
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.234 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a02d40f3-55cb-4d8a-a2f6-093dc5237cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.254 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c1bf94-b12e-4493-b7be-b0ed1ff60441]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48ecc0e0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:99:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494480, 'reachable_time': 42437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226805, 'error': None, 'target': 'ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.270 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[40b41212-bf38-45d7-a3ca-1b590bd02d7d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:993b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494480, 'tstamp': 494480}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226806, 'error': None, 'target': 'ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.299 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[765da8ef-6e8b-4d63-b6e1-17dd6ec67d31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48ecc0e0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:99:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494480, 'reachable_time': 42437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226807, 'error': None, 'target': 'ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.339 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6f72c0be-05be-4d33-ba19-29880014ac3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.411 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7b75020d-c34c-45bf-b935-95b5fe722e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.412 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ecc0e0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.413 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.413 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48ecc0e0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.415 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 NetworkManager[54973]: <info>  [1769121348.4161] manager: (tap48ecc0e0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 22 17:35:48 np0005592767 kernel: tap48ecc0e0-60: entered promiscuous mode
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.419 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.420 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48ecc0e0-60, col_values=(('external_ids', {'iface-id': 'acb8223f-3858-4bd9-ac28-62100dce475f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:48Z|00413|binding|INFO|Releasing lport acb8223f-3858-4bd9-ac28-62100dce475f from this chassis (sb_readonly=0)
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.421 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.439 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48ecc0e0-69c0-4e79-a289-0bf82207c044.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48ecc0e0-69c0-4e79-a289-0bf82207c044.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.443 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ece4c6eb-9a00-42a7-ba80-890703c5f491]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.444 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-48ecc0e0-69c0-4e79-a289-0bf82207c044
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/48ecc0e0-69c0-4e79-a289-0bf82207c044.pid.haproxy
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 48ecc0e0-69c0-4e79-a289-0bf82207c044
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:35:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:48.444 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'env', 'PROCESS_TAG=haproxy-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48ecc0e0-69c0-4e79-a289-0bf82207c044.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.536 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121348.5364096, 5b929866-486a-4348-9787-e2f273dbecc8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.537 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.540 182627 DEBUG nova.compute.manager [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.545 182627 INFO nova.virt.libvirt.driver [-] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Instance running successfully.#033[00m
Jan 22 17:35:48 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.547 182627 DEBUG nova.virt.libvirt.guest [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.548 182627 DEBUG nova.virt.libvirt.driver [None req-615170b5-7dc2-4505-ba0e-925b23ce7bce f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.570 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.581 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.615 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.620 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121348.540045, 5b929866-486a-4348-9787-e2f273dbecc8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.620 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] VM Started (Lifecycle Event)#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.653 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:35:48 np0005592767 nova_compute[182623]: 2026-01-22 22:35:48.656 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:35:48 np0005592767 podman[226846]: 2026-01-22 22:35:48.88005465 +0000 UTC m=+0.064402594 container create b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:35:48 np0005592767 systemd[1]: Started libpod-conmon-b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb.scope.
Jan 22 17:35:48 np0005592767 podman[226846]: 2026-01-22 22:35:48.852319685 +0000 UTC m=+0.036667669 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:35:48 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:35:48 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a6e39cf98cc0c5558a26b7ae47213e7c1bb3147452ddf01ed3c22b1e21b02ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:35:48 np0005592767 podman[226846]: 2026-01-22 22:35:48.981427648 +0000 UTC m=+0.165775602 container init b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:35:48 np0005592767 podman[226846]: 2026-01-22 22:35:48.986641086 +0000 UTC m=+0.170989030 container start b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:35:49 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [NOTICE]   (226865) : New worker (226867) forked
Jan 22 17:35:49 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [NOTICE]   (226865) : Loading success.
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.068 182627 INFO nova.virt.libvirt.driver [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Snapshot image upload complete#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.068 182627 INFO nova.compute.manager [None req-6e20061b-650b-47fa-a7f3-b55910ebea3f 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Took 5.20 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.925 182627 DEBUG nova.compute.manager [req-94a402c4-4260-4c10-a2e5-a8c0e8c0b96e req-5fd3e21e-0f1e-41da-984f-24b347ae1bba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.926 182627 DEBUG oslo_concurrency.lockutils [req-94a402c4-4260-4c10-a2e5-a8c0e8c0b96e req-5fd3e21e-0f1e-41da-984f-24b347ae1bba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.926 182627 DEBUG oslo_concurrency.lockutils [req-94a402c4-4260-4c10-a2e5-a8c0e8c0b96e req-5fd3e21e-0f1e-41da-984f-24b347ae1bba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.926 182627 DEBUG oslo_concurrency.lockutils [req-94a402c4-4260-4c10-a2e5-a8c0e8c0b96e req-5fd3e21e-0f1e-41da-984f-24b347ae1bba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.927 182627 DEBUG nova.compute.manager [req-94a402c4-4260-4c10-a2e5-a8c0e8c0b96e req-5fd3e21e-0f1e-41da-984f-24b347ae1bba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:35:49 np0005592767 nova_compute[182623]: 2026-01-22 22:35:49.927 182627 WARNING nova.compute.manager [req-94a402c4-4260-4c10-a2e5-a8c0e8c0b96e req-5fd3e21e-0f1e-41da-984f-24b347ae1bba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state None.#033[00m
Jan 22 17:35:50 np0005592767 nova_compute[182623]: 2026-01-22 22:35:50.701 182627 DEBUG nova.network.neutron [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updated VIF entry in instance network info cache for port 0b3799ee-2c54-4b41-a4fd-6b8596e79125. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:35:50 np0005592767 nova_compute[182623]: 2026-01-22 22:35:50.702 182627 DEBUG nova.network.neutron [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updating instance_info_cache with network_info: [{"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:50 np0005592767 nova_compute[182623]: 2026-01-22 22:35:50.723 182627 DEBUG oslo_concurrency.lockutils [req-3c966dfe-195c-42de-8cd8-f3e4102eceea req-f9d81e4b-a940-4141-a057-152301899a4a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:51 np0005592767 nova_compute[182623]: 2026-01-22 22:35:51.318 182627 DEBUG nova.network.neutron [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 22 17:35:51 np0005592767 nova_compute[182623]: 2026-01-22 22:35:51.319 182627 DEBUG oslo_concurrency.lockutils [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:51 np0005592767 nova_compute[182623]: 2026-01-22 22:35:51.319 182627 DEBUG oslo_concurrency.lockutils [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:51 np0005592767 nova_compute[182623]: 2026-01-22 22:35:51.320 182627 DEBUG nova.network.neutron [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:35:51 np0005592767 nova_compute[182623]: 2026-01-22 22:35:51.581 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.192 182627 DEBUG nova.compute.manager [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.192 182627 DEBUG oslo_concurrency.lockutils [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.193 182627 DEBUG oslo_concurrency.lockutils [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.194 182627 DEBUG oslo_concurrency.lockutils [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.194 182627 DEBUG nova.compute.manager [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.194 182627 WARNING nova.compute.manager [req-b6117822-b54b-4082-b871-7619439f4f1e req-0b33e6a8-e0c9-453b-984c-108c42900888 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 22 17:35:52 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:35:52 np0005592767 systemd[226536]: Activating special unit Exit the Session...
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped target Main User Target.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped target Basic System.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped target Paths.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped target Sockets.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped target Timers.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:35:52 np0005592767 systemd[226536]: Closed D-Bus User Message Bus Socket.
Jan 22 17:35:52 np0005592767 systemd[226536]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:35:52 np0005592767 systemd[226536]: Removed slice User Application Slice.
Jan 22 17:35:52 np0005592767 systemd[226536]: Reached target Shutdown.
Jan 22 17:35:52 np0005592767 systemd[226536]: Finished Exit the Session.
Jan 22 17:35:52 np0005592767 systemd[226536]: Reached target Exit the Session.
Jan 22 17:35:52 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:35:52 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:35:52 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:35:52 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:35:52 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:35:52 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:35:52 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.702 182627 INFO nova.compute.manager [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Rescuing#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.704 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.705 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquired lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.705 182627 DEBUG nova.network.neutron [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:35:52 np0005592767 nova_compute[182623]: 2026-01-22 22:35:52.806 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:54Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:99:38 10.100.0.13
Jan 22 17:35:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:54Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:99:38 10.100.0.13
Jan 22 17:35:54 np0005592767 podman[226892]: 2026-01-22 22:35:54.168626592 +0000 UTC m=+0.081588070 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, 
config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:35:54 np0005592767 podman[226891]: 2026-01-22 22:35:54.222367702 +0000 UTC m=+0.135819174 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:35:54 np0005592767 nova_compute[182623]: 2026-01-22 22:35:54.734 182627 DEBUG nova.network.neutron [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updating instance_info_cache with network_info: [{"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:35:54 np0005592767 nova_compute[182623]: 2026-01-22 22:35:54.858 182627 DEBUG oslo_concurrency.lockutils [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:35:54 np0005592767 nova_compute[182623]: 2026-01-22 22:35:54.887 182627 DEBUG nova.virt.libvirt.driver [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Creating tmpfile /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8/tmp4hk3nkbe to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Jan 22 17:35:54 np0005592767 kernel: tap0b3799ee-2c (unregistering): left promiscuous mode
Jan 22 17:35:54 np0005592767 NetworkManager[54973]: <info>  [1769121354.9200] device (tap0b3799ee-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:35:54 np0005592767 nova_compute[182623]: 2026-01-22 22:35:54.927 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:54Z|00414|binding|INFO|Releasing lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 from this chassis (sb_readonly=0)
Jan 22 17:35:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:54Z|00415|binding|INFO|Setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 down in Southbound
Jan 22 17:35:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:54Z|00416|binding|INFO|Removing iface tap0b3799ee-2c ovn-installed in OVS
Jan 22 17:35:54 np0005592767 nova_compute[182623]: 2026-01-22 22:35:54.930 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:54 np0005592767 nova_compute[182623]: 2026-01-22 22:35:54.945 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:54.950 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:b3:21 10.100.0.12'], port_security=['fa:16:3e:d7:b3:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5b929866-486a-4348-9787-e2f273dbecc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3e777c20-1be0-45e2-8716-c8375f2870cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31433937-7792-4bbb-b1e0-0b9aaac0b8c0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0b3799ee-2c54-4b41-a4fd-6b8596e79125) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:54.951 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 in datapath 48ecc0e0-69c0-4e79-a289-0bf82207c044 unbound from our chassis#033[00m
Jan 22 17:35:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:54.953 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48ecc0e0-69c0-4e79-a289-0bf82207c044, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:35:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:54.954 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[74b9efaf-28d1-4fbd-9c1c-63f0875057b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:54.955 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044 namespace which is not needed anymore#033[00m
Jan 22 17:35:54 np0005592767 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 22 17:35:54 np0005592767 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000006d.scope: Consumed 6.889s CPU time.
Jan 22 17:35:54 np0005592767 systemd-machined[153912]: Machine qemu-56-instance-0000006d terminated.
Jan 22 17:35:55 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [NOTICE]   (226865) : haproxy version is 2.8.14-c23fe91
Jan 22 17:35:55 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [NOTICE]   (226865) : path to executable is /usr/sbin/haproxy
Jan 22 17:35:55 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [WARNING]  (226865) : Exiting Master process...
Jan 22 17:35:55 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [ALERT]    (226865) : Current worker (226867) exited with code 143 (Terminated)
Jan 22 17:35:55 np0005592767 neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044[226861]: [WARNING]  (226865) : All workers exited. Exiting... (0)
Jan 22 17:35:55 np0005592767 systemd[1]: libpod-b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb.scope: Deactivated successfully.
Jan 22 17:35:55 np0005592767 podman[226966]: 2026-01-22 22:35:55.090828896 +0000 UTC m=+0.048002709 container died b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:35:55 np0005592767 kernel: tap0b3799ee-2c: entered promiscuous mode
Jan 22 17:35:55 np0005592767 NetworkManager[54973]: <info>  [1769121355.1159] manager: (tap0b3799ee-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 systemd-udevd[226944]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00417|binding|INFO|Claiming lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 for this chassis.
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00418|binding|INFO|0b3799ee-2c54-4b41-a4fd-6b8596e79125: Claiming fa:16:3e:d7:b3:21 10.100.0.12
Jan 22 17:35:55 np0005592767 kernel: tap0b3799ee-2c (unregistering): left promiscuous mode
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.127 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:b3:21 10.100.0.12'], port_security=['fa:16:3e:d7:b3:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5b929866-486a-4348-9787-e2f273dbecc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3e777c20-1be0-45e2-8716-c8375f2870cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31433937-7792-4bbb-b1e0-0b9aaac0b8c0, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0b3799ee-2c54-4b41-a4fd-6b8596e79125) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb-userdata-shm.mount: Deactivated successfully.
Jan 22 17:35:55 np0005592767 systemd[1]: var-lib-containers-storage-overlay-2a6e39cf98cc0c5558a26b7ae47213e7c1bb3147452ddf01ed3c22b1e21b02ce-merged.mount: Deactivated successfully.
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.142 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00419|binding|INFO|Setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 ovn-installed in OVS
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00420|binding|INFO|Setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 up in Southbound
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00421|binding|INFO|Releasing lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 from this chassis (sb_readonly=1)
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00422|if_status|INFO|Dropped 1 log messages in last 382 seconds (most recently, 382 seconds ago) due to excessive rate
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00423|if_status|INFO|Not setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 down as sb is readonly
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00424|binding|INFO|Removing iface tap0b3799ee-2c ovn-installed in OVS
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.147 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 podman[226966]: 2026-01-22 22:35:55.150934897 +0000 UTC m=+0.108108710 container cleanup b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00425|binding|INFO|Releasing lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 from this chassis (sb_readonly=0)
Jan 22 17:35:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:55Z|00426|binding|INFO|Setting lport 0b3799ee-2c54-4b41-a4fd-6b8596e79125 down in Southbound
Jan 22 17:35:55 np0005592767 systemd[1]: libpod-conmon-b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb.scope: Deactivated successfully.
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.165 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:b3:21 10.100.0.12'], port_security=['fa:16:3e:d7:b3:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5b929866-486a-4348-9787-e2f273dbecc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3e777c20-1be0-45e2-8716-c8375f2870cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31433937-7792-4bbb-b1e0-0b9aaac0b8c0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0b3799ee-2c54-4b41-a4fd-6b8596e79125) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.173 182627 INFO nova.virt.libvirt.driver [-] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Instance destroyed successfully.#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.173 182627 DEBUG nova.objects.instance [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.187 182627 DEBUG nova.virt.libvirt.vif [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:34:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-835067223',display_name='tempest-TestNetworkAdvancedServerOps-server-835067223',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-835067223',id=109,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDxykQRXSNI+vSIm+OvqW9Fj8rLp6B2aYNvIPpABIfDgOXpNo2F13/vwM8hDU3IOu3FDDhj5A57STA0vGN4KBedUqz5S0z+W5QUE2jUWvHQHsl24ZGJuBG8cdlli+DZNLg==',key_name='tempest-TestNetworkAdvancedServerOps-1367215877',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-kdmvtsoy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:35:48Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=5b929866-486a-4348-9787-e2f273dbecc8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.188 182627 DEBUG nova.network.os_vif_util [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.188 182627 DEBUG nova.network.os_vif_util [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.188 182627 DEBUG os_vif [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.190 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.190 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b3799ee-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.193 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.195 182627 INFO os_vif [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:b3:21,bridge_name='br-int',has_traffic_filtering=True,id=0b3799ee-2c54-4b41-a4fd-6b8596e79125,network=Network(48ecc0e0-69c0-4e79-a289-0bf82207c044),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3799ee-2c')#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.196 182627 INFO nova.virt.libvirt.driver [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Deleting instance files /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8_del#033[00m
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.203 182627 INFO nova.virt.libvirt.driver [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Deletion of /var/lib/nova/instances/5b929866-486a-4348-9787-e2f273dbecc8_del complete#033[00m
Jan 22 17:35:55 np0005592767 podman[227007]: 2026-01-22 22:35:55.216464961 +0000 UTC m=+0.041444924 container remove b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.222 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[adc86f1b-9948-4fe4-a118-e90f65854221]: (4, ('Thu Jan 22 10:35:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044 (b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb)\nb0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb\nThu Jan 22 10:35:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044 (b0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb)\nb0a3bc98406b01c1cc3ea3f9e1011e8a510c4cb5ecf5c2101e50f83ddccac9eb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.224 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[432ec846-7f89-4bce-8645-2b0c3f5b33e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.225 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48ecc0e0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:35:55 np0005592767 kernel: tap48ecc0e0-60: left promiscuous mode
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.227 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.273 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.277 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2074197b-e7f8-4b09-af0a-c2f8851de035]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.294 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9bcac4-f51d-4402-ad6d-69a376972b38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.295 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8ceba060-a039-4211-a6fa-fee5abc8cc84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.311 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2e5f8851-4911-4bc7-9f70-8c86c0937655]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494467, 'reachable_time': 18908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227021, 'error': None, 'target': 'ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 systemd[1]: run-netns-ovnmeta\x2d48ecc0e0\x2d69c0\x2d4e79\x2da289\x2d0bf82207c044.mount: Deactivated successfully.
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.313 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48ecc0e0-69c0-4e79-a289-0bf82207c044 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.313 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c36fb9-c9d8-46c6-afe6-26036fe96bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.314 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 in datapath 48ecc0e0-69c0-4e79-a289-0bf82207c044 unbound from our chassis
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.315 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48ecc0e0-69c0-4e79-a289-0bf82207c044, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.316 182627 DEBUG oslo_concurrency.lockutils [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.316 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[48d6d3d1-e5c2-472d-a861-022063da3234]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.316 182627 DEBUG oslo_concurrency.lockutils [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.317 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 in datapath 48ecc0e0-69c0-4e79-a289-0bf82207c044 unbound from our chassis
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.318 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48ecc0e0-69c0-4e79-a289-0bf82207c044, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:35:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:55.318 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca2d8ac-e34b-4d1c-b203-c4b7092f49b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.340 182627 DEBUG nova.objects.instance [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid 5b929866-486a-4348-9787-e2f273dbecc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.476 182627 DEBUG nova.compute.provider_tree [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.492 182627 DEBUG nova.scheduler.client.report [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.580 182627 DEBUG oslo_concurrency.lockutils [None req-c7685fc1-10ce-405f-adc7-57be46fcb618 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.668 182627 DEBUG nova.network.neutron [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Updating instance_info_cache with network_info: [{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.705 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Releasing lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.832 182627 DEBUG nova.compute.manager [req-1988e960-d527-430c-b750-929ee3e25afa req-f83b5d56-3802-4423-b7f8-70bf6dd6938d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.832 182627 DEBUG oslo_concurrency.lockutils [req-1988e960-d527-430c-b750-929ee3e25afa req-f83b5d56-3802-4423-b7f8-70bf6dd6938d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.833 182627 DEBUG oslo_concurrency.lockutils [req-1988e960-d527-430c-b750-929ee3e25afa req-f83b5d56-3802-4423-b7f8-70bf6dd6938d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.833 182627 DEBUG oslo_concurrency.lockutils [req-1988e960-d527-430c-b750-929ee3e25afa req-f83b5d56-3802-4423-b7f8-70bf6dd6938d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.833 182627 DEBUG nova.compute.manager [req-1988e960-d527-430c-b750-929ee3e25afa req-f83b5d56-3802-4423-b7f8-70bf6dd6938d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:55 np0005592767 nova_compute[182623]: 2026-01-22 22:35:55.833 182627 WARNING nova.compute.manager [req-1988e960-d527-430c-b750-929ee3e25afa req-f83b5d56-3802-4423-b7f8-70bf6dd6938d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.
Jan 22 17:35:56 np0005592767 nova_compute[182623]: 2026-01-22 22:35:56.128 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 17:35:56 np0005592767 nova_compute[182623]: 2026-01-22 22:35:56.583 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.458 182627 DEBUG nova.compute.manager [req-128e2e32-665d-4329-a467-a99dba54e00a req-f33e910f-d485-4095-94fd-b36ce2906fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.459 182627 DEBUG oslo_concurrency.lockutils [req-128e2e32-665d-4329-a467-a99dba54e00a req-f33e910f-d485-4095-94fd-b36ce2906fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.460 182627 DEBUG oslo_concurrency.lockutils [req-128e2e32-665d-4329-a467-a99dba54e00a req-f33e910f-d485-4095-94fd-b36ce2906fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.460 182627 DEBUG oslo_concurrency.lockutils [req-128e2e32-665d-4329-a467-a99dba54e00a req-f33e910f-d485-4095-94fd-b36ce2906fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.460 182627 DEBUG nova.compute.manager [req-128e2e32-665d-4329-a467-a99dba54e00a req-f33e910f-d485-4095-94fd-b36ce2906fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.461 182627 WARNING nova.compute.manager [req-128e2e32-665d-4329-a467-a99dba54e00a req-f33e910f-d485-4095-94fd-b36ce2906fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-unplugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.975 182627 DEBUG nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.975 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.976 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.976 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.977 182627 DEBUG nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.977 182627 WARNING nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.978 182627 DEBUG nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.978 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.978 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.979 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.979 182627 DEBUG nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.980 182627 WARNING nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.980 182627 DEBUG nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.981 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.981 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.982 182627 DEBUG oslo_concurrency.lockutils [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.982 182627 DEBUG nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:35:57 np0005592767 nova_compute[182623]: 2026-01-22 22:35:57.982 182627 WARNING nova.compute.manager [req-973f2f88-ce26-4124-b22f-7f3e58d12edb req-d1f3db97-0948-4ca6-850f-ca5978211589 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.
Jan 22 17:35:58 np0005592767 kernel: tap4b5c1570-4e (unregistering): left promiscuous mode
Jan 22 17:35:58 np0005592767 NetworkManager[54973]: <info>  [1769121358.2948] device (tap4b5c1570-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.300 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:58Z|00427|binding|INFO|Releasing lport 4b5c1570-4e54-4f2b-a349-702a4160e13a from this chassis (sb_readonly=0)
Jan 22 17:35:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:58Z|00428|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a down in Southbound
Jan 22 17:35:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:35:58Z|00429|binding|INFO|Removing iface tap4b5c1570-4e ovn-installed in OVS
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.315 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.318 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.318 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.322 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.322 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3becde92-8d83-4ef1-9e9a-f86fabe80279]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.323 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace which is not needed anymore#033[00m
Jan 22 17:35:58 np0005592767 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 22 17:35:58 np0005592767 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000006e.scope: Consumed 12.397s CPU time.
Jan 22 17:35:58 np0005592767 systemd-machined[153912]: Machine qemu-55-instance-0000006e terminated.
Jan 22 17:35:58 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [NOTICE]   (226671) : haproxy version is 2.8.14-c23fe91
Jan 22 17:35:58 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [NOTICE]   (226671) : path to executable is /usr/sbin/haproxy
Jan 22 17:35:58 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [WARNING]  (226671) : Exiting Master process...
Jan 22 17:35:58 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [ALERT]    (226671) : Current worker (226673) exited with code 143 (Terminated)
Jan 22 17:35:58 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[226667]: [WARNING]  (226671) : All workers exited. Exiting... (0)
Jan 22 17:35:58 np0005592767 systemd[1]: libpod-22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7.scope: Deactivated successfully.
Jan 22 17:35:58 np0005592767 podman[227046]: 2026-01-22 22:35:58.452992838 +0000 UTC m=+0.038957233 container died 22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:35:58 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7-userdata-shm.mount: Deactivated successfully.
Jan 22 17:35:58 np0005592767 systemd[1]: var-lib-containers-storage-overlay-88f93a804cd01757ba5d73a89cf9f5e4b94ff11206e000d609457a521d17c14e-merged.mount: Deactivated successfully.
Jan 22 17:35:58 np0005592767 podman[227046]: 2026-01-22 22:35:58.488167803 +0000 UTC m=+0.074132168 container cleanup 22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:35:58 np0005592767 systemd[1]: libpod-conmon-22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7.scope: Deactivated successfully.
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.545 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.550 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:58 np0005592767 podman[227075]: 2026-01-22 22:35:58.551286499 +0000 UTC m=+0.039205200 container remove 22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.559 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[78227511-5745-4f0a-8400-a6d28bd36aee]: (4, ('Thu Jan 22 10:35:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7)\n22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7\nThu Jan 22 10:35:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7)\n22e4aafb495b718fbbc223aae504ff49d8fa4c1badf773282c65381b6e9030c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.561 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8ab25d-55fb-4d10-900d-f683e9704097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.562 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.563 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:58 np0005592767 kernel: tapad2345e3-00: left promiscuous mode
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.579 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.581 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9df5e7c1-fc36-4e6e-8ef4-d973c0d1da85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.592 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[94a75b36-1b5c-42de-a6fc-37a6a2d2f181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.593 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6574b795-c0e4-4a2f-b3d1-0bab4d878cd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.612 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c760fe-70a1-4ee7-993f-f9fb3176485c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493672, 'reachable_time': 18108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227110, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.613 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:35:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:35:58.614 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c07947f0-5fba-413c-a625-220a19d3fdf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:35:58 np0005592767 systemd[1]: run-netns-ovnmeta\x2dad2345e3\x2d0b74\x2d4aee\x2daa42\x2dda6620725bb2.mount: Deactivated successfully.
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.958 182627 DEBUG nova.compute.manager [req-7a8cc4e6-df15-4dd8-81b9-ac073162331d req-c774e4cc-0216-40c3-bd46-2d0cc75f555b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.958 182627 DEBUG oslo_concurrency.lockutils [req-7a8cc4e6-df15-4dd8-81b9-ac073162331d req-c774e4cc-0216-40c3-bd46-2d0cc75f555b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.959 182627 DEBUG oslo_concurrency.lockutils [req-7a8cc4e6-df15-4dd8-81b9-ac073162331d req-c774e4cc-0216-40c3-bd46-2d0cc75f555b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.959 182627 DEBUG oslo_concurrency.lockutils [req-7a8cc4e6-df15-4dd8-81b9-ac073162331d req-c774e4cc-0216-40c3-bd46-2d0cc75f555b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.959 182627 DEBUG nova.compute.manager [req-7a8cc4e6-df15-4dd8-81b9-ac073162331d req-c774e4cc-0216-40c3-bd46-2d0cc75f555b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:35:58 np0005592767 nova_compute[182623]: 2026-01-22 22:35:58.960 182627 WARNING nova.compute.manager [req-7a8cc4e6-df15-4dd8-81b9-ac073162331d req-c774e4cc-0216-40c3-bd46-2d0cc75f555b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.160 182627 INFO nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance shutdown successfully after 3 seconds.#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.167 182627 INFO nova.virt.libvirt.driver [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance destroyed successfully.#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.167 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'numa_topology' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.189 182627 INFO nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Attempting a stable device rescue#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.623 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.629 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.630 182627 INFO nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Creating image(s)#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.631 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.631 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.632 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.633 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.656 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "7c9dda9354a1b10fc44c169b7a889804e407fad5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.658 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "7c9dda9354a1b10fc44c169b7a889804e407fad5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.970 182627 DEBUG nova.compute.manager [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-changed-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.971 182627 DEBUG nova.compute.manager [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Refreshing instance network info cache due to event network-changed-0b3799ee-2c54-4b41-a4fd-6b8596e79125. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.971 182627 DEBUG oslo_concurrency.lockutils [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.972 182627 DEBUG oslo_concurrency.lockutils [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:35:59 np0005592767 nova_compute[182623]: 2026-01-22 22:35:59.972 182627 DEBUG nova.network.neutron [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Refreshing network info cache for port 0b3799ee-2c54-4b41-a4fd-6b8596e79125 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:36:00 np0005592767 podman[227111]: 2026-01-22 22:36:00.132643875 +0000 UTC m=+0.054501793 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:00 np0005592767 podman[227112]: 2026-01-22 22:36:00.134333543 +0000 UTC m=+0.055106581 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.194 182627 DEBUG nova.compute.manager [req-21f6b622-6aed-47d3-ac16-1089b0c98518 req-a13cbc3b-a512-4f46-9f7c-412496506491 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.194 182627 DEBUG oslo_concurrency.lockutils [req-21f6b622-6aed-47d3-ac16-1089b0c98518 req-a13cbc3b-a512-4f46-9f7c-412496506491 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.195 182627 DEBUG oslo_concurrency.lockutils [req-21f6b622-6aed-47d3-ac16-1089b0c98518 req-a13cbc3b-a512-4f46-9f7c-412496506491 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.195 182627 DEBUG oslo_concurrency.lockutils [req-21f6b622-6aed-47d3-ac16-1089b0c98518 req-a13cbc3b-a512-4f46-9f7c-412496506491 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.195 182627 DEBUG nova.compute.manager [req-21f6b622-6aed-47d3-ac16-1089b0c98518 req-a13cbc3b-a512-4f46-9f7c-412496506491 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.195 182627 WARNING nova.compute.manager [req-21f6b622-6aed-47d3-ac16-1089b0c98518 req-a13cbc3b-a512-4f46-9f7c-412496506491 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 22 17:36:00 np0005592767 nova_compute[182623]: 2026-01-22 22:36:00.196 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.149 182627 DEBUG nova.compute.manager [req-50b69e69-dd8b-45a7-9e99-5181350f71bc req-19d04ac8-22d5-4adc-9dff-5b93436a0170 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.151 182627 DEBUG oslo_concurrency.lockutils [req-50b69e69-dd8b-45a7-9e99-5181350f71bc req-19d04ac8-22d5-4adc-9dff-5b93436a0170 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.151 182627 DEBUG oslo_concurrency.lockutils [req-50b69e69-dd8b-45a7-9e99-5181350f71bc req-19d04ac8-22d5-4adc-9dff-5b93436a0170 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.152 182627 DEBUG oslo_concurrency.lockutils [req-50b69e69-dd8b-45a7-9e99-5181350f71bc req-19d04ac8-22d5-4adc-9dff-5b93436a0170 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.152 182627 DEBUG nova.compute.manager [req-50b69e69-dd8b-45a7-9e99-5181350f71bc req-19d04ac8-22d5-4adc-9dff-5b93436a0170 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.152 182627 WARNING nova.compute.manager [req-50b69e69-dd8b-45a7-9e99-5181350f71bc req-19d04ac8-22d5-4adc-9dff-5b93436a0170 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.514 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.572 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.part --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.574 182627 DEBUG nova.virt.images [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] 39987214-2dc0-4220-bd0c-f0d5eb91daa7 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.574 182627 DEBUG nova.privsep.utils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.575 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.part /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.591 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.727 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.part /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.converted" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.733 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.786 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5.converted --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.787 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "7c9dda9354a1b10fc44c169b7a889804e407fad5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.800 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "7c9dda9354a1b10fc44c169b7a889804e407fad5" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.801 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "7c9dda9354a1b10fc44c169b7a889804e407fad5" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.812 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.873 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.874 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5,backing_fmt=raw /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.914 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5,backing_fmt=raw /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.rescue" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.915 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "7c9dda9354a1b10fc44c169b7a889804e407fad5" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.916 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'migration_context' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.939 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.941 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Start _get_guest_xml network_info=[{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "vif_mac": "fa:16:3e:b1:99:38"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '39987214-2dc0-4220-bd0c-f0d5eb91daa7', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.942 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'resources' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.959 182627 WARNING nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.964 182627 DEBUG nova.virt.libvirt.host [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.965 182627 DEBUG nova.virt.libvirt.host [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.969 182627 DEBUG nova.virt.libvirt.host [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.969 182627 DEBUG nova.virt.libvirt.host [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.971 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.971 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.972 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.972 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.972 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.972 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.973 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.973 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.973 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.973 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.974 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.974 182627 DEBUG nova.virt.hardware [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.974 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:01 np0005592767 nova_compute[182623]: 2026-01-22 22:36:01.989 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.045 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.046 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.046 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.047 182627 DEBUG oslo_concurrency.lockutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.048 182627 DEBUG nova.virt.libvirt.vif [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1138865448',display_name='tempest-ServerStableDeviceRescueTest-server-1138865448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1138865448',id=110,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:35:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-6arbga1b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:35:49Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=f160acde-2aa8-4109-94ea-ba98aaf63ad3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "vif_mac": "fa:16:3e:b1:99:38"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.049 182627 DEBUG nova.network.os_vif_util [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "vif_mac": "fa:16:3e:b1:99:38"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.049 182627 DEBUG nova.network.os_vif_util [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.051 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'pci_devices' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.066 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <uuid>f160acde-2aa8-4109-94ea-ba98aaf63ad3</uuid>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <name>instance-0000006e</name>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1138865448</nova:name>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:36:01</nova:creationTime>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:user uuid="9d1e26d3056148e692e157703469d77a">tempest-ServerStableDeviceRescueTest-395714292-project-member</nova:user>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:project uuid="9b1f07a8546648baba916fffc53a0b93">tempest-ServerStableDeviceRescueTest-395714292</nova:project>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        <nova:port uuid="4b5c1570-4e54-4f2b-a349-702a4160e13a">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <entry name="serial">f160acde-2aa8-4109-94ea-ba98aaf63ad3</entry>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <entry name="uuid">f160acde-2aa8-4109-94ea-ba98aaf63ad3</entry>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.rescue"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <target dev="vdb" bus="virtio"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <boot order="1"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b1:99:38"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <target dev="tap4b5c1570-4e"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/console.log" append="off"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:36:02 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:36:02 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:36:02 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:36:02 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.073 182627 INFO nova.virt.libvirt.driver [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance destroyed successfully.#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.134 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.134 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.135 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.135 182627 DEBUG nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] No VIF found with MAC fa:16:3e:b1:99:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.136 182627 INFO nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Using config drive#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.420 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.487 182627 DEBUG nova.objects.instance [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'keypairs' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.558 182627 DEBUG nova.compute.manager [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.558 182627 DEBUG oslo_concurrency.lockutils [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.559 182627 DEBUG oslo_concurrency.lockutils [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.560 182627 DEBUG oslo_concurrency.lockutils [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.560 182627 DEBUG nova.compute.manager [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.560 182627 WARNING nova.compute.manager [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.561 182627 DEBUG nova.compute.manager [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.561 182627 DEBUG oslo_concurrency.lockutils [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5b929866-486a-4348-9787-e2f273dbecc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.561 182627 DEBUG oslo_concurrency.lockutils [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.561 182627 DEBUG oslo_concurrency.lockutils [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5b929866-486a-4348-9787-e2f273dbecc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.562 182627 DEBUG nova.compute.manager [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] No waiting events found dispatching network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:02 np0005592767 nova_compute[182623]: 2026-01-22 22:36:02.562 182627 WARNING nova.compute.manager [req-bda627a1-2675-416c-99c4-81c346196796 req-3fc22a3f-815e-462b-b00b-f03ffc633a35 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Received unexpected event network-vif-plugged-0b3799ee-2c54-4b41-a4fd-6b8596e79125 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.085 182627 DEBUG nova.network.neutron [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updated VIF entry in instance network info cache for port 0b3799ee-2c54-4b41-a4fd-6b8596e79125. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.085 182627 DEBUG nova.network.neutron [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Updating instance_info_cache with network_info: [{"id": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "address": "fa:16:3e:d7:b3:21", "network": {"id": "48ecc0e0-69c0-4e79-a289-0bf82207c044", "bridge": "br-int", "label": "tempest-network-smoke--154670647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3799ee-2c", "ovs_interfaceid": "0b3799ee-2c54-4b41-a4fd-6b8596e79125", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.107 182627 DEBUG oslo_concurrency.lockutils [req-7693d357-3dcc-465d-9b85-89d184dcf697 req-73bbfb43-73ab-4078-8a84-638b8ae88b77 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5b929866-486a-4348-9787-e2f273dbecc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.259 182627 INFO nova.virt.libvirt.driver [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Creating config drive at /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config.rescue#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.263 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw1ylm9ly execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.396 182627 DEBUG oslo_concurrency.processutils [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw1ylm9ly" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:03 np0005592767 kernel: tap4b5c1570-4e: entered promiscuous mode
Jan 22 17:36:03 np0005592767 NetworkManager[54973]: <info>  [1769121363.4617] manager: (tap4b5c1570-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Jan 22 17:36:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:03Z|00430|binding|INFO|Claiming lport 4b5c1570-4e54-4f2b-a349-702a4160e13a for this chassis.
Jan 22 17:36:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:03Z|00431|binding|INFO|4b5c1570-4e54-4f2b-a349-702a4160e13a: Claiming fa:16:3e:b1:99:38 10.100.0.13
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.463 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.474 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '5', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.476 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 bound to our chassis#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.478 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2345e3-0b74-4aee-aa42-da6620725bb2#033[00m
Jan 22 17:36:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:03Z|00432|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a ovn-installed in OVS
Jan 22 17:36:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:03Z|00433|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a up in Southbound
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.482 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.484 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.489 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b99ef3ce-cc0a-4951-a09e-c6f6c23e45bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.490 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2345e3-01 in ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.492 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2345e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.492 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[55c7a13a-f2d3-4730-984b-cda56b22a5f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.492 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d407b701-30b7-46bd-8994-cbd057838196]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 systemd-machined[153912]: New machine qemu-57-instance-0000006e.
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.505 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5ecba1-1d28-449a-96b2-c4d5b13ed2a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 systemd[1]: Started Virtual Machine qemu-57-instance-0000006e.
Jan 22 17:36:03 np0005592767 systemd-udevd[227196]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.531 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3155d2bf-9f2d-48e1-b2bf-58b4eb5292b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 NetworkManager[54973]: <info>  [1769121363.5349] device (tap4b5c1570-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:36:03 np0005592767 NetworkManager[54973]: <info>  [1769121363.5356] device (tap4b5c1570-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.567 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b223fb19-cfbb-4a54-af9a-001b5a8adb65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 NetworkManager[54973]: <info>  [1769121363.5755] manager: (tapad2345e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.574 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[520c654c-ea32-415d-94b7-1037547a3d20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.608 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[53de4585-8e87-4f7e-a29f-9bb585f8d65c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.610 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e6c817-a55d-42d1-926f-16c8513db917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 NetworkManager[54973]: <info>  [1769121363.6351] device (tapad2345e3-00): carrier: link connected
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.641 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa4e020-793a-4005-8fbd-c9e2a1164482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.656 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b78d15-8f22-4989-a69c-c20b53f2b975]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496021, 'reachable_time': 24348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227226, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.673 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b57245ad-168d-4efb-ba3d-51270f783154]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:33c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496021, 'tstamp': 496021}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227227, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.689 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b4787d98-6996-4729-8239-c43b078d391b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496021, 'reachable_time': 24348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227228, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.718 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[84c84d72-2f74-4143-b953-acc6f2ddf43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.772 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[743a4a26-3744-4c3a-af85-b2fc04ba22ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.774 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.774 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.774 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2345e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:03 np0005592767 kernel: tapad2345e3-00: entered promiscuous mode
Jan 22 17:36:03 np0005592767 NetworkManager[54973]: <info>  [1769121363.7769] manager: (tapad2345e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.777 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.781 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2345e3-00, col_values=(('external_ids', {'iface-id': 'bd160f04-1c71-4851-91cb-64d88f335d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.782 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:03Z|00434|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.785 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.786 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8e584288-7a78-4352-8373-48cc74b5409c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.786 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:36:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:03.787 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'env', 'PROCESS_TAG=haproxy-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2345e3-0b74-4aee-aa42-da6620725bb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.795 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.943 182627 DEBUG nova.compute.manager [req-d1979f56-c17a-49d5-8efe-092eaa2cf6c0 req-c40d8305-0c5b-4ef7-a597-692939f001e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.944 182627 DEBUG oslo_concurrency.lockutils [req-d1979f56-c17a-49d5-8efe-092eaa2cf6c0 req-c40d8305-0c5b-4ef7-a597-692939f001e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.944 182627 DEBUG oslo_concurrency.lockutils [req-d1979f56-c17a-49d5-8efe-092eaa2cf6c0 req-c40d8305-0c5b-4ef7-a597-692939f001e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.945 182627 DEBUG oslo_concurrency.lockutils [req-d1979f56-c17a-49d5-8efe-092eaa2cf6c0 req-c40d8305-0c5b-4ef7-a597-692939f001e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.945 182627 DEBUG nova.compute.manager [req-d1979f56-c17a-49d5-8efe-092eaa2cf6c0 req-c40d8305-0c5b-4ef7-a597-692939f001e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:03 np0005592767 nova_compute[182623]: 2026-01-22 22:36:03.945 182627 WARNING nova.compute.manager [req-d1979f56-c17a-49d5-8efe-092eaa2cf6c0 req-c40d8305-0c5b-4ef7-a597-692939f001e2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:36:04 np0005592767 podman[227260]: 2026-01-22 22:36:04.156121979 +0000 UTC m=+0.062400497 container create 4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:36:04 np0005592767 systemd[1]: Started libpod-conmon-4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04.scope.
Jan 22 17:36:04 np0005592767 podman[227260]: 2026-01-22 22:36:04.120708927 +0000 UTC m=+0.026987535 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:36:04 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:36:04 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/663aa6b20fcd384f443e6364f4bbe199ecc569ab64898037cbb0db28b1048da7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:36:04 np0005592767 podman[227260]: 2026-01-22 22:36:04.263783025 +0000 UTC m=+0.170061543 container init 4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:36:04 np0005592767 podman[227260]: 2026-01-22 22:36:04.274422557 +0000 UTC m=+0.180701075 container start 4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:36:04 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [NOTICE]   (227278) : New worker (227280) forked
Jan 22 17:36:04 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [NOTICE]   (227278) : Loading success.
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.594 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for f160acde-2aa8-4109-94ea-ba98aaf63ad3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.594 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121364.5934541, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.595 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.611 182627 DEBUG nova.compute.manager [None req-19126209-686e-46c4-bc52-ac3d38180d37 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.645 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.648 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.684 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.684 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121364.594669, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.685 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Started (Lifecycle Event)#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.714 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:04 np0005592767 nova_compute[182623]: 2026-01-22 22:36:04.719 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:36:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:05.172 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:05.174 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:36:05 np0005592767 nova_compute[182623]: 2026-01-22 22:36:05.175 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:05 np0005592767 nova_compute[182623]: 2026-01-22 22:36:05.197 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:05 np0005592767 nova_compute[182623]: 2026-01-22 22:36:05.714 182627 INFO nova.compute.manager [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Unrescuing#033[00m
Jan 22 17:36:05 np0005592767 nova_compute[182623]: 2026-01-22 22:36:05.715 182627 DEBUG oslo_concurrency.lockutils [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:36:05 np0005592767 nova_compute[182623]: 2026-01-22 22:36:05.715 182627 DEBUG oslo_concurrency.lockutils [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquired lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:36:05 np0005592767 nova_compute[182623]: 2026-01-22 22:36:05.715 182627 DEBUG nova.network.neutron [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.032 182627 DEBUG nova.compute.manager [req-37b216c1-61f8-4df2-a7b7-7b7c74072326 req-1903d638-ee0f-4231-896a-b5b9d7efe6d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.033 182627 DEBUG oslo_concurrency.lockutils [req-37b216c1-61f8-4df2-a7b7-7b7c74072326 req-1903d638-ee0f-4231-896a-b5b9d7efe6d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.033 182627 DEBUG oslo_concurrency.lockutils [req-37b216c1-61f8-4df2-a7b7-7b7c74072326 req-1903d638-ee0f-4231-896a-b5b9d7efe6d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.034 182627 DEBUG oslo_concurrency.lockutils [req-37b216c1-61f8-4df2-a7b7-7b7c74072326 req-1903d638-ee0f-4231-896a-b5b9d7efe6d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.034 182627 DEBUG nova.compute.manager [req-37b216c1-61f8-4df2-a7b7-7b7c74072326 req-1903d638-ee0f-4231-896a-b5b9d7efe6d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.035 182627 WARNING nova.compute.manager [req-37b216c1-61f8-4df2-a7b7-7b7c74072326 req-1903d638-ee0f-4231-896a-b5b9d7efe6d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 22 17:36:06 np0005592767 nova_compute[182623]: 2026-01-22 22:36:06.590 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.327 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006e', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9b1f07a8546648baba916fffc53a0b93', 'user_id': '9d1e26d3056148e692e157703469d77a', 'hostId': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.333 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f160acde-2aa8-4109-94ea-ba98aaf63ad3 / tap4b5c1570-4e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.333 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4874113-0af0-4e30-a39d-3b2b26456edc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.329308', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be2669e4-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': 'e7890a7e99bfb7f87ad1c684f3578ea63700cec4e19e9a51f50b5ac5b9294d36'}]}, 'timestamp': '2026-01-22 22:36:07.334937', '_unique_id': '43fc380c8ddb4a449d69a378253ab14a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.338 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.340 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f080be4-4ad9-410d-9466-252c3a3cac45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.340931', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be2776a4-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': '476072b4c1b17487649949b785c5238103648159bedf54ee386e50f003ccea73'}]}, 'timestamp': '2026-01-22 22:36:07.341473', '_unique_id': '0713194a5f10458b929978aa3a2fe834'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.342 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.343 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.369 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/cpu volume: 2660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9ac112a-b812-4ac3-af76-29f84303bb78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2660000000, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'timestamp': '2026-01-22T22:36:07.343898', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'be2bd85c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.004231701, 'message_signature': '3a2c138c95a3b9e50c7d45b065d7f68e333cd663ccce0d439c2d2746a72300b0'}]}, 'timestamp': '2026-01-22 22:36:07.370268', '_unique_id': '49d253311d0f4b4cb17e3da3842a96c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.373 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65c24620-2a1d-4240-b315-200b1676de87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.373627', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be2c7320-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': 'ba3c15286d3da49422afa4bbe058aeede62c7f98c90454b3f2d60e3da76bd841'}]}, 'timestamp': '2026-01-22 22:36:07.374160', '_unique_id': '526ea0abc0df472f8c59172c19b9a461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.375 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.377 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.377 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>]
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.377 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.439 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.440 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.441 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14568ce9-9297-432a-8f46-fa132b1b2912', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.377802', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be367f14-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'd300404357f3c0077299ea9267a40227e0679ec07befd1ea43bd6a168230e3d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.377802', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be36a412-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '9086987883a1aeaa2c054b8ad0e6f69d5b27899f7430a47f2112326cecb4126c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.377802', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be36bd94-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '0f64dde591da765d70d19bd16a4b373af09ac73ccb6555bd46642057aadde8df'}]}, 'timestamp': '2026-01-22 22:36:07.441672', '_unique_id': '4ba091b710ef4ddca8f4bb72d042dbbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.443 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.445 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.445 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0394c1e-ae5d-40a7-942e-cd3458ae5ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.445704', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be3774aa-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': '47e1d9554db1276e62f0778101a793c87aedd5f6cc5e34c713efc18d8d09cfcd'}]}, 'timestamp': '2026-01-22 22:36:07.446447', '_unique_id': 'd14245dba5da444c9aca901554d0839d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.480 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.481 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.482 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd99d84f8-8a7f-428f-94a7-9c7f38a8b25f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.449789', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be3cd648-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': '82e5a9b83b9acd252294b1cc7fd505856fae99a875c8d87a7aab587885f4d8d9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.449789', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be3cf038-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': 'e1560e0eb4e6fe713e2710f6b0ea3389c0398bce213b9d6679571331bea98bf7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.449789', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be3d0a00-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': '80d640ebda45a9ee94ba5e6f9191582ea857e97e3c03336cfe8ee092338bd3da'}]}, 'timestamp': '2026-01-22 22:36:07.482948', '_unique_id': '03b8126695fd4b6c8b7315d3db407654'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.484 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.486 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.487 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>]
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.487 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.487 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.488 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.488 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73645531-e076-4216-95eb-07de6a95f12b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.487641', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be3dd7d2-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'd92517239a315e8aa58b7fb6d6c10aa5c806867bb6d53c11ee7037e9c7c76fc1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 
'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.487641', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be3defd8-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'c0ad7182ffa4eb2b73cc581402dcc0e7dc42cea7e9d500bde825ac6129a52ad9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.487641', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be3e07b6-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '7fc80e4d2a7b648e78164d4d6ffe417f43927d34285683a2dcee8b907614082d'}]}, 'timestamp': '2026-01-22 22:36:07.489463', '_unique_id': 'a080aeb048824689a1811ac32dba11c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.490 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.492 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.492 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97c4b525-ea59-4de4-86a5-6ea8462ceec0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.492911', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be3eaae0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': 'ff3eb235ae3b0f67b433844888b8dcf7ac99c118b4b3779c518ac572199c04e0'}]}, 'timestamp': '2026-01-22 22:36:07.493640', '_unique_id': 'bcb9ebfd24554227988d820fae930a66'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.496 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.497 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.498 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deea572e-d859-478f-8bb1-46d6353baea1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.496867', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be3f445a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '4ec67c46b0bb9e011b53bd892ac527c70f28fdf4be46da905f8abf7821d899ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 
'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.496867', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be3f5c9c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '7bea230756426260edc432decaafe80bd7368bd85bcf3c9be4584f8f43628546'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.496867', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be3f74f2-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '7c959040c97015a1d7eed5b2771f24cdddb4b4ee213c633ee4be153b964843f4'}]}, 'timestamp': '2026-01-22 22:36:07.498769', '_unique_id': 'f87e00f3894649969e5a6d30601be80b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.501 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.501 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.501 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.502 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '619d0649-41dd-4fca-87d1-897c989b60b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.501488', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be3ff36e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '693c797ed71b6ab816c5365b6527a91979041c8c3f61d8cf8e7bbe9cc1b8c5d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 
'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.501488', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be4003ae-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '51be47ab989e3b74992dbb2a59cf3ade65d912db09217462bb207f00eba04ea4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.501488', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be4013ee-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '2f0da636cae54a35c658bc31a4472c0dd0ed988dabae5dae96a214a00dfac675'}]}, 'timestamp': '2026-01-22 22:36:07.502737', '_unique_id': 'fd35261b7d164bb397860fa27dcd3b57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.504 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.505 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.505 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f160acde-2aa8-4109-94ea-ba98aaf63ad3: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.505 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.506 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.latency volume: 144219035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.506 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.latency volume: 422502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c99d402e-bd64-4539-a1bb-f946f3e9284f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.505578', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be40931e-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '28413d42524bf0051b2e3e51362a22453d2b851e2d1ca6c7f4044568915b4226'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 144219035, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': 
None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.505578', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be40a476-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'a3dc4a9b1787d2eb45592bed60e6f2590a3eb7011952dbf0f065edaa4b73aa24'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 422502, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.505578', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be40b3d0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'edb177ff2677b77d667aac07d0a88becffa5a9efc172a3382aa3ee5ecacfe973'}]}, 'timestamp': '2026-01-22 22:36:07.506839', '_unique_id': 'e2c80ea05a414ed781a48c92446dbde4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.508 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.509 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.509 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>]
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.509 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.509 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bda425b7-0314-4a1e-a0cc-a7787c17d003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.509565', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be412d74-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': '777f2d16d38c0a65d8cd08c442140c3cefbee385f26a7e51e4377f741f6f444b'}]}, 'timestamp': '2026-01-22 22:36:07.509923', '_unique_id': 'a88c83a1d7bc468482948b9b3e5cf591'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.510 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.512 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.512 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.513 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8390133a-8860-49af-b21f-feee240a684b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.512131', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be4194ee-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'eb74103aec4b23f9cdf28631101c744b4e426ad1aad755d0d805173adfffe287'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.512131', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be41a574-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': 'ce354dca8eefa2849b2587f0affba93b4787e946d6714a5bc8f483abf0b89b40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.512131', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be41b636-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.012729731, 'message_signature': '6090291ec17ccf8a5e5ee5ac30f9a58d38fbeabeb81a8319af536a6b61418c91'}]}, 'timestamp': '2026-01-22 22:36:07.513460', '_unique_id': '85bb0cee9fb74ccc96d5b3326ce99802'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.514 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.515 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.515 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66d2ead3-c1c5-4c9e-bfd2-8f5dd731c2d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.515897', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be4226fc-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': '4e69fadf5c949dffa0aff470e6d767cf906b1252600d9770121d1661c1a20d82'}]}, 'timestamp': '2026-01-22 22:36:07.516400', '_unique_id': '44470e5c382641caa72e4a31bbb8dd2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.518 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.518 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.518 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1138865448>]
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.518 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.519 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.519 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.520 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fa87b2e-e272-447d-9f0b-db2756e65691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.519098', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be42a46a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': '7c1d3494e74d1fab22683613f344cc7be038ecb3d547d115049e7598ae895a4d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.519098', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be42b61c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': 'b0e247daa93d882d76a792bf4c6b012558e9c6f79d2bf949323658df0baeea6f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.519098', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be42c6c0-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': '8f126aed9d93fd191afdabdba92c3593f9897f510e2b6a97d60d7999003a590e'}]}, 'timestamp': '2026-01-22 22:36:07.520400', '_unique_id': '84a473a3896d41328ff435117f84f151'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.521 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.522 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.522 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.522 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.523 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33ca1c08-9156-401b-b3cc-a054c15e2f3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vda', 'timestamp': '2026-01-22T22:36:07.522324', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'be43219c-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': 'cbad2d53f4d1ad2a521507ab594494febb95541ca4d035be9257a39ff221dc67'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-vdb', 'timestamp': '2026-01-22T22:36:07.522324', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': 'be433182-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': 'dadb68d7b73faa48a511d09a103cacdc107ed5bf455c123cce971f2b5e6fdce6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3-sda', 'timestamp': '2026-01-22T22:36:07.522324', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'instance-0000006e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'be434230-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4964.084722848, 'message_signature': 'dde9d43a95280d4eb67ca2c406791a6cfe8744efff86e48ad374fdcbceb2bbed'}]}, 'timestamp': '2026-01-22 22:36:07.523616', '_unique_id': 'c12780a05959421e958a9aff9620471a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.524 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.525 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4b55b50-724e-41df-9bcb-640c603efdca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.525852', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be43ab8a-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': '73849b03758bd66336d9fdc181559291d9899aadc83a26c51ea271846d9d352c'}]}, 'timestamp': '2026-01-22 22:36:07.526346', '_unique_id': 'd66e7976781c49b9bb9a81c3d79e3a7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.527 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.528 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.528 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f3db1a5-0d3e-4b9b-a842-4b6d2b0b2952', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.528510', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be441336-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': 'd226cb6984d1dced2b097fac5dc8bb57a4be138351cf05b79e8d59d806e05acf'}]}, 'timestamp': '2026-01-22 22:36:07.528966', '_unique_id': '4c38b0f9462249828e9cc7da0c923f2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.529 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.531 12 DEBUG ceilometer.compute.pollsters [-] f160acde-2aa8-4109-94ea-ba98aaf63ad3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '638b6d27-6831-41a6-b96a-1d6f1af0e7eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9d1e26d3056148e692e157703469d77a', 'user_name': None, 'project_id': '9b1f07a8546648baba916fffc53a0b93', 'project_name': None, 'resource_id': 'instance-0000006e-f160acde-2aa8-4109-94ea-ba98aaf63ad3-tap4b5c1570-4e', 'timestamp': '2026-01-22T22:36:07.531100', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1138865448', 'name': 'tap4b5c1570-4e', 'instance_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'instance_type': 'm1.nano', 'host': '594f90839929a1d12868722919752100784198df23aee2d2807755e0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b1:99:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4b5c1570-4e'}, 'message_id': 'be4479ca-f7e2-11f0-a43a-fa163ed01feb', 'monotonic_time': 4963.96425602, 'message_signature': '4adc8660813226ad3f9d708cc96291d5a3aa0d43f3f00c3ebe62f91703da36ec'}]}, 'timestamp': '2026-01-22 22:36:07.531621', '_unique_id': '74e94f7523324935a7ae4dac200ed4a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:36:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:36:07.532 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:36:08 np0005592767 podman[227296]: 2026-01-22 22:36:08.158508573 +0000 UTC m=+0.077068697 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.164 182627 DEBUG nova.network.neutron [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Updating instance_info_cache with network_info: [{"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.176 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.188 182627 DEBUG oslo_concurrency.lockutils [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Releasing lock "refresh_cache-f160acde-2aa8-4109-94ea-ba98aaf63ad3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.189 182627 DEBUG nova.objects.instance [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'flavor' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:09 np0005592767 kernel: tap4b5c1570-4e (unregistering): left promiscuous mode
Jan 22 17:36:09 np0005592767 NetworkManager[54973]: <info>  [1769121369.2975] device (tap4b5c1570-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00435|binding|INFO|Releasing lport 4b5c1570-4e54-4f2b-a349-702a4160e13a from this chassis (sb_readonly=0)
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00436|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a down in Southbound
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00437|binding|INFO|Removing iface tap4b5c1570-4e ovn-installed in OVS
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.306 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.308 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.324 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.329 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '6', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.330 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.332 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.334 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f250f2-b998-47dd-ac6d-1578486cf284]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.335 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace which is not needed anymore#033[00m
Jan 22 17:36:09 np0005592767 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 22 17:36:09 np0005592767 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000006e.scope: Consumed 5.883s CPU time.
Jan 22 17:36:09 np0005592767 systemd-machined[153912]: Machine qemu-57-instance-0000006e terminated.
Jan 22 17:36:09 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [NOTICE]   (227278) : haproxy version is 2.8.14-c23fe91
Jan 22 17:36:09 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [NOTICE]   (227278) : path to executable is /usr/sbin/haproxy
Jan 22 17:36:09 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [WARNING]  (227278) : Exiting Master process...
Jan 22 17:36:09 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [ALERT]    (227278) : Current worker (227280) exited with code 143 (Terminated)
Jan 22 17:36:09 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227274]: [WARNING]  (227278) : All workers exited. Exiting... (0)
Jan 22 17:36:09 np0005592767 systemd[1]: libpod-4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04.scope: Deactivated successfully.
Jan 22 17:36:09 np0005592767 podman[227345]: 2026-01-22 22:36:09.463643472 +0000 UTC m=+0.046004676 container died 4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:36:09 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04-userdata-shm.mount: Deactivated successfully.
Jan 22 17:36:09 np0005592767 systemd[1]: var-lib-containers-storage-overlay-663aa6b20fcd384f443e6364f4bbe199ecc569ab64898037cbb0db28b1048da7-merged.mount: Deactivated successfully.
Jan 22 17:36:09 np0005592767 podman[227345]: 2026-01-22 22:36:09.494531508 +0000 UTC m=+0.076892722 container cleanup 4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:36:09 np0005592767 systemd[1]: libpod-conmon-4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04.scope: Deactivated successfully.
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.545 182627 INFO nova.virt.libvirt.driver [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance destroyed successfully.#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.545 182627 DEBUG nova.objects.instance [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'numa_topology' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:09 np0005592767 podman[227377]: 2026-01-22 22:36:09.55235206 +0000 UTC m=+0.038737854 container remove 4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.557 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4fba13-49ad-472f-9389-43dc12aa1e29]: (4, ('Thu Jan 22 10:36:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04)\n4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04\nThu Jan 22 10:36:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04)\n4791bfb83137727e36cf43090b1ff21c106f2041beb9da98f5077ae980339d04\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.559 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0252b847-49ce-4e13-a346-ea4d31455480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.560 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.562 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 kernel: tapad2345e3-00: left promiscuous mode
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.577 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.580 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0552e22f-c333-4791-834d-b61b6eb3e7fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.593 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba13f77-56b4-4783-b356-7d12eeaddaa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.594 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a507d4-87be-41c3-9e58-652975722960]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.608 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[12667c0b-ee1a-4fc0-ad2a-cd602d97fc7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496013, 'reachable_time': 22965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227415, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.611 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.611 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a21dcb-f134-4586-a02d-62d9ad1d370c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 systemd[1]: run-netns-ovnmeta\x2dad2345e3\x2d0b74\x2d4aee\x2daa42\x2dda6620725bb2.mount: Deactivated successfully.
Jan 22 17:36:09 np0005592767 kernel: tap4b5c1570-4e: entered promiscuous mode
Jan 22 17:36:09 np0005592767 NetworkManager[54973]: <info>  [1769121369.6300] manager: (tap4b5c1570-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Jan 22 17:36:09 np0005592767 systemd-udevd[227325]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00438|binding|INFO|Claiming lport 4b5c1570-4e54-4f2b-a349-702a4160e13a for this chassis.
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00439|binding|INFO|4b5c1570-4e54-4f2b-a349-702a4160e13a: Claiming fa:16:3e:b1:99:38 10.100.0.13
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.631 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 NetworkManager[54973]: <info>  [1769121369.6407] device (tap4b5c1570-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:36:09 np0005592767 NetworkManager[54973]: <info>  [1769121369.6411] device (tap4b5c1570-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.641 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '6', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.642 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 bound to our chassis#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.644 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2345e3-0b74-4aee-aa42-da6620725bb2#033[00m
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00440|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a up in Southbound
Jan 22 17:36:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:09Z|00441|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a ovn-installed in OVS
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.654 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.656 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a2ead2-2268-4ca0-85e0-d104502b0363]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.657 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2345e3-01 in ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.658 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.659 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2345e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.659 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[08f4d519-3f83-496f-b84a-e663161dc0de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.661 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[99f4389e-caaa-4839-91b7-cff9cee60a71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.671 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[b447f087-546e-465f-8645-38909ff1b161]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 systemd-machined[153912]: New machine qemu-58-instance-0000006e.
Jan 22 17:36:09 np0005592767 systemd[1]: Started Virtual Machine qemu-58-instance-0000006e.
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.701 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[28250218-5a7a-420e-93da-f036bbeb2f49]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.733 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d780aa33-0176-4b51-9d8f-94c10b517c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 NetworkManager[54973]: <info>  [1769121369.7394] manager: (tapad2345e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/204)
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.738 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eddad8f9-ff7d-4b09-8d14-4a05b1a977c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.782 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8e864ae8-e383-44c6-a92c-c1830ff675bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.786 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9f65f8-3810-4f9d-a2ee-abceba11f1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 NetworkManager[54973]: <info>  [1769121369.8109] device (tapad2345e3-00): carrier: link connected
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.820 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[60fd4aeb-bbdd-4489-95d2-8293acd3d738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.841 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fe50bbac-0552-4732-9a21-cf63670a695c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496638, 'reachable_time': 21143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227457, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.862 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dc501898-3673-40c9-88e5-3e99a6007abe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:33c3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496638, 'tstamp': 496638}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227458, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.866 182627 DEBUG nova.compute.manager [req-35442443-581e-481f-942d-49cd8ebdac0d req-ef4fba95-8f3f-41fb-b46a-997fcf48cea6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.867 182627 DEBUG oslo_concurrency.lockutils [req-35442443-581e-481f-942d-49cd8ebdac0d req-ef4fba95-8f3f-41fb-b46a-997fcf48cea6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.867 182627 DEBUG oslo_concurrency.lockutils [req-35442443-581e-481f-942d-49cd8ebdac0d req-ef4fba95-8f3f-41fb-b46a-997fcf48cea6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.868 182627 DEBUG oslo_concurrency.lockutils [req-35442443-581e-481f-942d-49cd8ebdac0d req-ef4fba95-8f3f-41fb-b46a-997fcf48cea6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.869 182627 DEBUG nova.compute.manager [req-35442443-581e-481f-942d-49cd8ebdac0d req-ef4fba95-8f3f-41fb-b46a-997fcf48cea6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:09 np0005592767 nova_compute[182623]: 2026-01-22 22:36:09.869 182627 WARNING nova.compute.manager [req-35442443-581e-481f-942d-49cd8ebdac0d req-ef4fba95-8f3f-41fb-b46a-997fcf48cea6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.895 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[927aa632-5445-4697-ae14-b7b4c4d6f267]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2345e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:33:c3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496638, 'reachable_time': 21143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227459, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:09.940 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[55aa34e2-4542-401b-aedb-7939109006dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.010 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4d1200-4118-42f3-bc88-4dc58ad2d01d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.013 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.013 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.014 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2345e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:10 np0005592767 NetworkManager[54973]: <info>  [1769121370.0174] manager: (tapad2345e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 22 17:36:10 np0005592767 kernel: tapad2345e3-00: entered promiscuous mode
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.024 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2345e3-00, col_values=(('external_ids', {'iface-id': 'bd160f04-1c71-4851-91cb-64d88f335d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:10 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:10Z|00442|binding|INFO|Releasing lport bd160f04-1c71-4851-91cb-64d88f335d22 from this chassis (sb_readonly=0)
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.029 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.031 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[60d037a8-050b-4790-a53e-21808bb74f47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.032 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/ad2345e3-0b74-4aee-aa42-da6620725bb2.pid.haproxy
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID ad2345e3-0b74-4aee-aa42-da6620725bb2
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:36:10 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:10.033 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'env', 'PROCESS_TAG=haproxy-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2345e3-0b74-4aee-aa42-da6620725bb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.092 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for f160acde-2aa8-4109-94ea-ba98aaf63ad3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.093 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121370.091519, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.093 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.097 182627 DEBUG nova.compute.manager [None req-bdec4422-ca93-41ad-9f0a-d68280942ba1 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.121 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.125 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.150 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.151 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121370.0951693, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.151 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Started (Lifecycle Event)#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.171 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121355.1709883, 5b929866-486a-4348-9787-e2f273dbecc8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.173 182627 INFO nova.compute.manager [-] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.174 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.178 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.197 182627 DEBUG nova.compute.manager [None req-0b601360-4ea9-4dad-9dbb-9aa8094e2057 - - - - - -] [instance: 5b929866-486a-4348-9787-e2f273dbecc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:10 np0005592767 nova_compute[182623]: 2026-01-22 22:36:10.199 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:10 np0005592767 podman[227498]: 2026-01-22 22:36:10.413444233 +0000 UTC m=+0.050755087 container create ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:36:10 np0005592767 systemd[1]: Started libpod-conmon-ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c.scope.
Jan 22 17:36:10 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:36:10 np0005592767 podman[227498]: 2026-01-22 22:36:10.383932135 +0000 UTC m=+0.021243009 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:36:10 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ef77e5ea87f33d5d2a76fb3fa690f80b7f4a95b2b048f5b930cdee9190873ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:36:10 np0005592767 podman[227498]: 2026-01-22 22:36:10.501163734 +0000 UTC m=+0.138474598 container init ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:36:10 np0005592767 podman[227498]: 2026-01-22 22:36:10.510901384 +0000 UTC m=+0.148212238 container start ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:36:10 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [NOTICE]   (227518) : New worker (227520) forked
Jan 22 17:36:10 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [NOTICE]   (227518) : Loading success.
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.591 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.959 182627 DEBUG nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.959 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.960 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.960 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.960 182627 DEBUG nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.961 182627 WARNING nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state None.#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.961 182627 DEBUG nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.961 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.961 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.962 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.962 182627 DEBUG nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.962 182627 WARNING nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state None.#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.963 182627 DEBUG nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.963 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.963 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.963 182627 DEBUG oslo_concurrency.lockutils [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.964 182627 DEBUG nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:11 np0005592767 nova_compute[182623]: 2026-01-22 22:36:11.964 182627 WARNING nova.compute.manager [req-6597c361-b061-4195-a145-3c24c9c1cd07 req-4a4d3776-5b92-4d74-884e-690aa4d26721 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state active and task_state None.#033[00m
Jan 22 17:36:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:12.109 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:12.110 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:12.111 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.201 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.848 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.848 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.849 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.849 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.849 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.860 182627 INFO nova.compute.manager [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Terminating instance#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.870 182627 DEBUG nova.compute.manager [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:36:15 np0005592767 kernel: tap4b5c1570-4e (unregistering): left promiscuous mode
Jan 22 17:36:15 np0005592767 NetworkManager[54973]: <info>  [1769121375.8850] device (tap4b5c1570-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:36:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:15Z|00443|binding|INFO|Releasing lport 4b5c1570-4e54-4f2b-a349-702a4160e13a from this chassis (sb_readonly=0)
Jan 22 17:36:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:15Z|00444|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a down in Southbound
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.894 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:15Z|00445|binding|INFO|Removing iface tap4b5c1570-4e ovn-installed in OVS
Jan 22 17:36:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:15.908 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '8', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:15.910 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis#033[00m
Jan 22 17:36:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:15.911 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:36:15 np0005592767 nova_compute[182623]: 2026-01-22 22:36:15.911 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:15.912 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[de4d5b28-f542-4e92-b2c4-e904983ffdd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:15.912 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 namespace which is not needed anymore#033[00m
Jan 22 17:36:15 np0005592767 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 22 17:36:15 np0005592767 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000006e.scope: Consumed 6.235s CPU time.
Jan 22 17:36:15 np0005592767 systemd-machined[153912]: Machine qemu-58-instance-0000006e terminated.
Jan 22 17:36:16 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [NOTICE]   (227518) : haproxy version is 2.8.14-c23fe91
Jan 22 17:36:16 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [NOTICE]   (227518) : path to executable is /usr/sbin/haproxy
Jan 22 17:36:16 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [WARNING]  (227518) : Exiting Master process...
Jan 22 17:36:16 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [ALERT]    (227518) : Current worker (227520) exited with code 143 (Terminated)
Jan 22 17:36:16 np0005592767 neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2[227514]: [WARNING]  (227518) : All workers exited. Exiting... (0)
Jan 22 17:36:16 np0005592767 systemd[1]: libpod-ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c.scope: Deactivated successfully.
Jan 22 17:36:16 np0005592767 podman[227555]: 2026-01-22 22:36:16.057813961 +0000 UTC m=+0.048910876 container died ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:16 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c-userdata-shm.mount: Deactivated successfully.
Jan 22 17:36:16 np0005592767 NetworkManager[54973]: <info>  [1769121376.0913] manager: (tap4b5c1570-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Jan 22 17:36:16 np0005592767 kernel: tap4b5c1570-4e: entered promiscuous mode
Jan 22 17:36:16 np0005592767 systemd[1]: var-lib-containers-storage-overlay-3ef77e5ea87f33d5d2a76fb3fa690f80b7f4a95b2b048f5b930cdee9190873ce-merged.mount: Deactivated successfully.
Jan 22 17:36:16 np0005592767 kernel: tap4b5c1570-4e (unregistering): left promiscuous mode
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.149 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:16Z|00446|binding|INFO|Claiming lport 4b5c1570-4e54-4f2b-a349-702a4160e13a for this chassis.
Jan 22 17:36:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:16Z|00447|binding|INFO|4b5c1570-4e54-4f2b-a349-702a4160e13a: Claiming fa:16:3e:b1:99:38 10.100.0.13
Jan 22 17:36:16 np0005592767 podman[227555]: 2026-01-22 22:36:16.152018061 +0000 UTC m=+0.143114986 container cleanup ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.158 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '8', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:16 np0005592767 systemd[1]: libpod-conmon-ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c.scope: Deactivated successfully.
Jan 22 17:36:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:16Z|00448|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a ovn-installed in OVS
Jan 22 17:36:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:16Z|00449|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a up in Southbound
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.187 182627 INFO nova.virt.libvirt.driver [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Instance destroyed successfully.#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.188 182627 DEBUG nova.objects.instance [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lazy-loading 'resources' on Instance uuid f160acde-2aa8-4109-94ea-ba98aaf63ad3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.206 182627 DEBUG nova.virt.libvirt.vif [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:35:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1138865448',display_name='tempest-ServerStableDeviceRescueTest-server-1138865448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1138865448',id=110,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:36:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b1f07a8546648baba916fffc53a0b93',ramdisk_id='',reservation_id='r-6arbga1b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-395714292',owner_user_name='tempest-ServerStableDeviceRescueTest-395714292-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:36:10Z,user_data=None,user_id='9d1e26d3056148e692e157703469d77a',uuid=f160acde-2aa8-4109-94ea-ba98aaf63ad3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.207 182627 DEBUG nova.network.os_vif_util [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converting VIF {"id": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "address": "fa:16:3e:b1:99:38", "network": {"id": "ad2345e3-0b74-4aee-aa42-da6620725bb2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1240878426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b1f07a8546648baba916fffc53a0b93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5c1570-4e", "ovs_interfaceid": "4b5c1570-4e54-4f2b-a349-702a4160e13a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.207 182627 DEBUG nova.network.os_vif_util [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.208 182627 DEBUG os_vif [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.209 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.210 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b5c1570-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.211 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:16Z|00450|binding|INFO|Releasing lport 4b5c1570-4e54-4f2b-a349-702a4160e13a from this chassis (sb_readonly=0)
Jan 22 17:36:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:16Z|00451|binding|INFO|Setting lport 4b5c1570-4e54-4f2b-a349-702a4160e13a down in Southbound
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.213 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.216 182627 INFO os_vif [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:99:38,bridge_name='br-int',has_traffic_filtering=True,id=4b5c1570-4e54-4f2b-a349-702a4160e13a,network=Network(ad2345e3-0b74-4aee-aa42-da6620725bb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5c1570-4e')#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.217 182627 INFO nova.virt.libvirt.driver [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Deleting instance files /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3_del#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.218 182627 INFO nova.virt.libvirt.driver [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Deletion of /var/lib/nova/instances/f160acde-2aa8-4109-94ea-ba98aaf63ad3_del complete#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.218 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:99:38 10.100.0.13'], port_security=['fa:16:3e:b1:99:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f160acde-2aa8-4109-94ea-ba98aaf63ad3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b1f07a8546648baba916fffc53a0b93', 'neutron:revision_number': '8', 'neutron:security_group_ids': '69b0063a-58ee-4aa4-b0cf-6c3ee79813fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0ff06b3-66f3-4b94-b027-fc55f3af185e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4b5c1570-4e54-4f2b-a349-702a4160e13a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.230 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 podman[227592]: 2026-01-22 22:36:16.234331913 +0000 UTC m=+0.047415615 container remove ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.238 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1fed5917-19d7-47c4-9065-dc4d5b32c80f]: (4, ('Thu Jan 22 10:36:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c)\nff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c\nThu Jan 22 10:36:16 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 (ff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c)\nff52c9b9d273c9aca3dda04a7b2f34b9f42bf5b66e7488656f134adb5766406c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.240 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7fec1a-b865-4ba6-8a1d-ba889ebbc9f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.241 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2345e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.243 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 kernel: tapad2345e3-00: left promiscuous mode
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.254 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.257 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[db819fdf-6284-4260-8dc6-e5160d0dea35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.275 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5ef1e8-2747-4ffd-b359-f47f910fef1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.276 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bab487cf-ebcc-416c-950b-65c343bda64e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.291 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e32cd536-23b5-4f72-956d-b2eb86a46f24]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496630, 'reachable_time': 15572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227609, 'error': None, 'target': 'ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.293 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2345e3-0b74-4aee-aa42-da6620725bb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.293 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[0e40d248-95dc-444a-aa80-4ce1a52ecc02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.294 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis#033[00m
Jan 22 17:36:16 np0005592767 systemd[1]: run-netns-ovnmeta\x2dad2345e3\x2d0b74\x2d4aee\x2daa42\x2dda6620725bb2.mount: Deactivated successfully.
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.295 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.296 182627 DEBUG nova.compute.manager [req-02f79a3e-f224-4763-a180-29e580ebeb98 req-e03255da-f771-4895-82a6-c11eb2390912 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.296 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[49add648-e17c-468c-affb-72e713094f80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.296 182627 DEBUG oslo_concurrency.lockutils [req-02f79a3e-f224-4763-a180-29e580ebeb98 req-e03255da-f771-4895-82a6-c11eb2390912 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.296 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5c1570-4e54-4f2b-a349-702a4160e13a in datapath ad2345e3-0b74-4aee-aa42-da6620725bb2 unbound from our chassis#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.297 182627 DEBUG oslo_concurrency.lockutils [req-02f79a3e-f224-4763-a180-29e580ebeb98 req-e03255da-f771-4895-82a6-c11eb2390912 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.297 182627 DEBUG oslo_concurrency.lockutils [req-02f79a3e-f224-4763-a180-29e580ebeb98 req-e03255da-f771-4895-82a6-c11eb2390912 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.297 182627 DEBUG nova.compute.manager [req-02f79a3e-f224-4763-a180-29e580ebeb98 req-e03255da-f771-4895-82a6-c11eb2390912 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.297 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2345e3-0b74-4aee-aa42-da6620725bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.297 182627 DEBUG nova.compute.manager [req-02f79a3e-f224-4763-a180-29e580ebeb98 req-e03255da-f771-4895-82a6-c11eb2390912 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:36:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:16.298 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[014a5a80-60d0-4368-b85c-6a33a47f69c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.300 182627 INFO nova.compute.manager [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.301 182627 DEBUG oslo.service.loopingcall [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.302 182627 DEBUG nova.compute.manager [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.302 182627 DEBUG nova.network.neutron [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.592 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.877 182627 DEBUG nova.network.neutron [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.903 182627 INFO nova.compute.manager [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Took 0.60 seconds to deallocate network for instance.#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.980 182627 DEBUG nova.compute.manager [req-f3b7029b-0d58-4345-93d0-c8628820c3cf req-49d01764-bca0-47f9-b245-4e5e3b05f8c0 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-deleted-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.994 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:16 np0005592767 nova_compute[182623]: 2026-01-22 22:36:16.995 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.028 182627 DEBUG nova.scheduler.client.report [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.050 182627 DEBUG nova.scheduler.client.report [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.051 182627 DEBUG nova.compute.provider_tree [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.069 182627 DEBUG nova.scheduler.client.report [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.114 182627 DEBUG nova.scheduler.client.report [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.183 182627 DEBUG nova.compute.provider_tree [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.244 182627 DEBUG nova.scheduler.client.report [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.278 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.313 182627 INFO nova.scheduler.client.report [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Deleted allocations for instance f160acde-2aa8-4109-94ea-ba98aaf63ad3#033[00m
Jan 22 17:36:17 np0005592767 nova_compute[182623]: 2026-01-22 22:36:17.455 182627 DEBUG oslo_concurrency.lockutils [None req-448e5a88-62ad-4175-9c92-8fddda27eb69 9d1e26d3056148e692e157703469d77a 9b1f07a8546648baba916fffc53a0b93 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.386 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.387 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.387 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.388 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.388 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.389 182627 WARNING nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.389 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.390 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.390 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.390 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.391 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.391 182627 WARNING nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.392 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.392 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.392 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.393 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.393 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.394 182627 WARNING nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.394 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.395 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.395 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.396 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.396 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.396 182627 WARNING nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-unplugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.397 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.397 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.398 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.398 182627 DEBUG oslo_concurrency.lockutils [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f160acde-2aa8-4109-94ea-ba98aaf63ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.399 182627 DEBUG nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] No waiting events found dispatching network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:18 np0005592767 nova_compute[182623]: 2026-01-22 22:36:18.399 182627 WARNING nova.compute.manager [req-12dc01d0-bfe9-4896-bcfd-7c75204410de req-3f215b6e-0147-46ee-be3b-7bda043a1f1f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Received unexpected event network-vif-plugged-4b5c1570-4e54-4f2b-a349-702a4160e13a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:36:19 np0005592767 podman[227610]: 2026-01-22 22:36:19.191897304 +0000 UTC m=+0.097500962 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:36:20 np0005592767 nova_compute[182623]: 2026-01-22 22:36:20.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.213 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.593 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.926 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:36:21 np0005592767 nova_compute[182623]: 2026-01-22 22:36:21.926 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:22 np0005592767 nova_compute[182623]: 2026-01-22 22:36:22.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:23 np0005592767 nova_compute[182623]: 2026-01-22 22:36:23.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:23 np0005592767 nova_compute[182623]: 2026-01-22 22:36:23.917 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:23 np0005592767 nova_compute[182623]: 2026-01-22 22:36:23.917 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:23 np0005592767 nova_compute[182623]: 2026-01-22 22:36:23.918 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:23 np0005592767 nova_compute[182623]: 2026-01-22 22:36:23.918 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.177 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.179 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5670MB free_disk=73.1966667175293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.179 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.179 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.232 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.233 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.314 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.331 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.355 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:36:24 np0005592767 nova_compute[182623]: 2026-01-22 22:36:24.356 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:25 np0005592767 podman[227634]: 2026-01-22 22:36:25.186108947 +0000 UTC m=+0.093409299 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Jan 22 17:36:25 np0005592767 podman[227633]: 2026-01-22 22:36:25.219122152 +0000 UTC m=+0.126104775 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:36:25 np0005592767 nova_compute[182623]: 2026-01-22 22:36:25.355 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:25 np0005592767 nova_compute[182623]: 2026-01-22 22:36:25.356 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:25 np0005592767 nova_compute[182623]: 2026-01-22 22:36:25.356 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:36:26 np0005592767 nova_compute[182623]: 2026-01-22 22:36:26.215 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:26 np0005592767 nova_compute[182623]: 2026-01-22 22:36:26.596 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:26 np0005592767 nova_compute[182623]: 2026-01-22 22:36:26.861 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:26 np0005592767 nova_compute[182623]: 2026-01-22 22:36:26.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:27 np0005592767 nova_compute[182623]: 2026-01-22 22:36:27.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:27 np0005592767 nova_compute[182623]: 2026-01-22 22:36:27.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.177 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.177 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.201 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.321 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.322 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.331 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.332 182627 INFO nova.compute.claims [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.456 182627 DEBUG nova.compute.provider_tree [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.471 182627 DEBUG nova.scheduler.client.report [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.492 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.494 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.542 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.543 182627 DEBUG nova.network.neutron [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.562 182627 INFO nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.579 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.682 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.684 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.685 182627 INFO nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Creating image(s)#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.686 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "/var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.686 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.687 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "/var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.712 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.800 182627 DEBUG nova.policy [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10767689cb2d4ee383920e3d388a6dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '25a5678696f747b3ac42324626646e40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.818 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.819 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.820 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.842 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.937 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.938 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.984 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.986 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:30 np0005592767 nova_compute[182623]: 2026-01-22 22:36:30.987 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.045 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.047 182627 DEBUG nova.virt.disk.api [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Checking if we can resize image /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.047 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.105 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.107 182627 DEBUG nova.virt.disk.api [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Cannot resize image /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.107 182627 DEBUG nova.objects.instance [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'migration_context' on Instance uuid a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.121 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.122 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Ensure instance console log exists: /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.122 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.122 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.123 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:31 np0005592767 podman[227695]: 2026-01-22 22:36:31.151356358 +0000 UTC m=+0.062399881 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:36:31 np0005592767 podman[227694]: 2026-01-22 22:36:31.177245765 +0000 UTC m=+0.088361500 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.186 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121376.185964, f160acde-2aa8-4109-94ea-ba98aaf63ad3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.187 182627 INFO nova.compute.manager [-] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.217 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.219 182627 DEBUG nova.compute.manager [None req-2b4b1385-e470-4158-9fda-a74cd6f3799e - - - - - -] [instance: f160acde-2aa8-4109-94ea-ba98aaf63ad3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.610 182627 DEBUG nova.network.neutron [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Successfully created port: f974cce2-59f5-4552-a588-26d630ab871a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:36:31 np0005592767 nova_compute[182623]: 2026-01-22 22:36:31.637 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.670 182627 DEBUG nova.network.neutron [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Successfully updated port: f974cce2-59f5-4552-a588-26d630ab871a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.695 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "refresh_cache-a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.695 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquired lock "refresh_cache-a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.695 182627 DEBUG nova.network.neutron [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.767 182627 DEBUG nova.compute.manager [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-changed-f974cce2-59f5-4552-a588-26d630ab871a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.767 182627 DEBUG nova.compute.manager [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Refreshing instance network info cache due to event network-changed-f974cce2-59f5-4552-a588-26d630ab871a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.768 182627 DEBUG oslo_concurrency.lockutils [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:36:32 np0005592767 nova_compute[182623]: 2026-01-22 22:36:32.941 182627 DEBUG nova.network.neutron [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.025 182627 DEBUG nova.network.neutron [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Updating instance_info_cache with network_info: [{"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.054 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Releasing lock "refresh_cache-a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.054 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Instance network_info: |[{"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.055 182627 DEBUG oslo_concurrency.lockutils [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.055 182627 DEBUG nova.network.neutron [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Refreshing network info cache for port f974cce2-59f5-4552-a588-26d630ab871a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.057 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Start _get_guest_xml network_info=[{"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.062 182627 WARNING nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.066 182627 DEBUG nova.virt.libvirt.host [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.067 182627 DEBUG nova.virt.libvirt.host [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.070 182627 DEBUG nova.virt.libvirt.host [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.071 182627 DEBUG nova.virt.libvirt.host [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.072 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.072 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.072 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.072 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.073 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.073 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.073 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.073 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.074 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.074 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.074 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.074 182627 DEBUG nova.virt.hardware [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.077 182627 DEBUG nova.virt.libvirt.vif [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-764702493',display_name='tempest-ServersTestJSON-server-764702493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-764702493',id=115,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-95btt9xg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},t
ags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:36:30Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.078 182627 DEBUG nova.network.os_vif_util [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.078 182627 DEBUG nova.network.os_vif_util [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.079 182627 DEBUG nova.objects.instance [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'pci_devices' on Instance uuid a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.094 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <uuid>a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd</uuid>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <name>instance-00000073</name>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersTestJSON-server-764702493</nova:name>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:36:34</nova:creationTime>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:user uuid="10767689cb2d4ee383920e3d388a6dfe">tempest-ServersTestJSON-1676167595-project-member</nova:user>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:project uuid="25a5678696f747b3ac42324626646e40">tempest-ServersTestJSON-1676167595</nova:project>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        <nova:port uuid="f974cce2-59f5-4552-a588-26d630ab871a">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <entry name="serial">a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd</entry>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <entry name="uuid">a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd</entry>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.config"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:b8:66:ea"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <target dev="tapf974cce2-59"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/console.log" append="off"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:36:34 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:36:34 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:36:34 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:36:34 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.096 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Preparing to wait for external event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.097 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.097 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.098 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.099 182627 DEBUG nova.virt.libvirt.vif [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-764702493',display_name='tempest-ServersTestJSON-server-764702493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-764702493',id=115,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-95btt9xg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:36:30Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.099 182627 DEBUG nova.network.os_vif_util [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.100 182627 DEBUG nova.network.os_vif_util [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.101 182627 DEBUG os_vif [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.102 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.102 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.103 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.106 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf974cce2-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.106 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf974cce2-59, col_values=(('external_ids', {'iface-id': 'f974cce2-59f5-4552-a588-26d630ab871a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:66:ea', 'vm-uuid': 'a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 NetworkManager[54973]: <info>  [1769121394.1108] manager: (tapf974cce2-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.119 182627 INFO os_vif [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59')#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.167 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.167 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.168 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] No VIF found with MAC fa:16:3e:b8:66:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.168 182627 INFO nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Using config drive#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.560 182627 INFO nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Creating config drive at /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.config#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.569 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2wql42tb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.712 182627 DEBUG oslo_concurrency.processutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2wql42tb" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:36:34 np0005592767 kernel: tapf974cce2-59: entered promiscuous mode
Jan 22 17:36:34 np0005592767 NetworkManager[54973]: <info>  [1769121394.7915] manager: (tapf974cce2-59): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 22 17:36:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:34Z|00452|binding|INFO|Claiming lport f974cce2-59f5-4552-a588-26d630ab871a for this chassis.
Jan 22 17:36:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:34Z|00453|binding|INFO|f974cce2-59f5-4552-a588-26d630ab871a: Claiming fa:16:3e:b8:66:ea 10.100.0.8
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.794 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.799 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.813 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:66:ea 10.100.0.8'], port_security=['fa:16:3e:b8:66:ea 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=f974cce2-59f5-4552-a588-26d630ab871a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.815 104135 INFO neutron.agent.ovn.metadata.agent [-] Port f974cce2-59f5-4552-a588-26d630ab871a in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 bound to our chassis#033[00m
Jan 22 17:36:34 np0005592767 systemd-udevd[227755]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.818 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.830 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e72f208a-019b-4340-a640-93131851993a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.831 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8cfbdc2a-d1 in ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.834 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8cfbdc2a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.834 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6656c076-aa07-493b-98f4-19da0b872b6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.836 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e83b2f6c-3510-4e96-baa5-eb263d7ab3c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 NetworkManager[54973]: <info>  [1769121394.8390] device (tapf974cce2-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:36:34 np0005592767 NetworkManager[54973]: <info>  [1769121394.8402] device (tapf974cce2-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:36:34 np0005592767 systemd-machined[153912]: New machine qemu-59-instance-00000073.
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.846 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ca0ee5-7031-4eb1-b634-59fc8779dea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.880 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7734b5e0-99a7-4d45-85dc-d95558af66dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.881 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:34Z|00454|binding|INFO|Setting lport f974cce2-59f5-4552-a588-26d630ab871a ovn-installed in OVS
Jan 22 17:36:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:34Z|00455|binding|INFO|Setting lport f974cce2-59f5-4552-a588-26d630ab871a up in Southbound
Jan 22 17:36:34 np0005592767 systemd[1]: Started Virtual Machine qemu-59-instance-00000073.
Jan 22 17:36:34 np0005592767 nova_compute[182623]: 2026-01-22 22:36:34.891 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.911 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7506e503-f337-4cd6-92b5-292de5bc0540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 systemd-udevd[227760]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:36:34 np0005592767 NetworkManager[54973]: <info>  [1769121394.9191] manager: (tap8cfbdc2a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.919 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[64f0f72c-ec88-455d-ba75-0e7a72f164a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.956 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4f00b731-3eb0-4d78-89d0-19bbfc19e9fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.960 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[af1d2e58-2c01-4a4d-81cd-538076618843]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:34 np0005592767 NetworkManager[54973]: <info>  [1769121394.9858] device (tap8cfbdc2a-d0): carrier: link connected
Jan 22 17:36:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:34.992 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e4425472-dbf2-48af-8dc4-0f167f3d6911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.013 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6d5e80-c65e-4a6f-88ec-7b0ad4d25eb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499156, 'reachable_time': 19735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227790, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.036 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9f40f5-75ae-41a2-a7fd-9434d5ad826f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:b87b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499156, 'tstamp': 499156}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227791, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.064 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1dac13fd-5b8e-4ec3-afdf-d243a753bda4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cfbdc2a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b8:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499156, 'reachable_time': 19735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227792, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.106 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb07deb-82b5-4c70-9a78-515976c92784]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.170 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121395.1690896, a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.170 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] VM Started (Lifecycle Event)#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.172 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc66201-9b0e-4c4d-8812-176adb7db04c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.174 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.174 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.175 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cfbdc2a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.175 182627 DEBUG nova.network.neutron [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Updated VIF entry in instance network info cache for port f974cce2-59f5-4552-a588-26d630ab871a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.176 182627 DEBUG nova.network.neutron [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Updating instance_info_cache with network_info: [{"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:36:35 np0005592767 NetworkManager[54973]: <info>  [1769121395.2235] manager: (tap8cfbdc2a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 22 17:36:35 np0005592767 kernel: tap8cfbdc2a-d0: entered promiscuous mode
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.225 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cfbdc2a-d0, col_values=(('external_ids', {'iface-id': '48ec79a0-32c2-47c3-bffb-8836aa917258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.226 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:35Z|00456|binding|INFO|Releasing lport 48ec79a0-32c2-47c3-bffb-8836aa917258 from this chassis (sb_readonly=0)
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.227 182627 DEBUG oslo_concurrency.lockutils [req-73973dd8-85e6-4b09-8763-1d5fd017f1a8 req-0ef0f1e7-9b0b-4cac-a840-5805ec76ca12 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.228 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.229 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.229 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[923058f7-ca31-4420-9bbd-348b91aae75f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.230 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.pid.haproxy
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 8cfbdc2a-d644-40be-b1e2-2d2471aaf695
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:36:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:35.231 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'env', 'PROCESS_TAG=haproxy-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8cfbdc2a-d644-40be-b1e2-2d2471aaf695.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.235 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121395.1693482, a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.235 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.238 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.254 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.258 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.273 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:36:35 np0005592767 podman[227831]: 2026-01-22 22:36:35.730386313 +0000 UTC m=+0.067038739 container create 21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.774 182627 DEBUG nova.compute.manager [req-bac24a7c-c741-48f0-8044-5339c639de9a req-17e2a5e8-30cd-4bec-9550-d1924c25720f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.775 182627 DEBUG oslo_concurrency.lockutils [req-bac24a7c-c741-48f0-8044-5339c639de9a req-17e2a5e8-30cd-4bec-9550-d1924c25720f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.775 182627 DEBUG oslo_concurrency.lockutils [req-bac24a7c-c741-48f0-8044-5339c639de9a req-17e2a5e8-30cd-4bec-9550-d1924c25720f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.776 182627 DEBUG oslo_concurrency.lockutils [req-bac24a7c-c741-48f0-8044-5339c639de9a req-17e2a5e8-30cd-4bec-9550-d1924c25720f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.776 182627 DEBUG nova.compute.manager [req-bac24a7c-c741-48f0-8044-5339c639de9a req-17e2a5e8-30cd-4bec-9550-d1924c25720f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Processing event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:36:35 np0005592767 systemd[1]: Started libpod-conmon-21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39.scope.
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.777 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.783 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121395.783641, a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.784 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.786 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.790 182627 INFO nova.virt.libvirt.driver [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Instance spawned successfully.#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.790 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:36:35 np0005592767 podman[227831]: 2026-01-22 22:36:35.700961868 +0000 UTC m=+0.037614264 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:36:35 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:36:35 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a723b82dc43613ac8b9beaa9170e589dea38541e5140c5dfac82ca4faf9e2eaa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.824 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:35 np0005592767 podman[227831]: 2026-01-22 22:36:35.830199599 +0000 UTC m=+0.166851995 container init 21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.832 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:36:35 np0005592767 podman[227831]: 2026-01-22 22:36:35.836552365 +0000 UTC m=+0.173204751 container start 21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.837 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.837 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.838 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.838 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.839 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.840 182627 DEBUG nova.virt.libvirt.driver [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:36:35 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [NOTICE]   (227850) : New worker (227852) forked
Jan 22 17:36:35 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [NOTICE]   (227850) : Loading success.
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.876 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.911 182627 INFO nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Took 5.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:36:35 np0005592767 nova_compute[182623]: 2026-01-22 22:36:35.911 182627 DEBUG nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:36:36 np0005592767 nova_compute[182623]: 2026-01-22 22:36:36.004 182627 INFO nova.compute.manager [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Took 5.74 seconds to build instance.#033[00m
Jan 22 17:36:36 np0005592767 nova_compute[182623]: 2026-01-22 22:36:36.023 182627 DEBUG oslo_concurrency.lockutils [None req-b298b5eb-3174-4d23-975d-475cdd82c857 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:36 np0005592767 nova_compute[182623]: 2026-01-22 22:36:36.640 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:37 np0005592767 nova_compute[182623]: 2026-01-22 22:36:37.884 182627 DEBUG nova.compute.manager [req-5c62f524-88b3-4ac7-a6ad-7852e0457250 req-a3da33ea-4463-4787-aea7-74fe76ca9ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:37 np0005592767 nova_compute[182623]: 2026-01-22 22:36:37.885 182627 DEBUG oslo_concurrency.lockutils [req-5c62f524-88b3-4ac7-a6ad-7852e0457250 req-a3da33ea-4463-4787-aea7-74fe76ca9ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:37 np0005592767 nova_compute[182623]: 2026-01-22 22:36:37.885 182627 DEBUG oslo_concurrency.lockutils [req-5c62f524-88b3-4ac7-a6ad-7852e0457250 req-a3da33ea-4463-4787-aea7-74fe76ca9ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:37 np0005592767 nova_compute[182623]: 2026-01-22 22:36:37.886 182627 DEBUG oslo_concurrency.lockutils [req-5c62f524-88b3-4ac7-a6ad-7852e0457250 req-a3da33ea-4463-4787-aea7-74fe76ca9ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:37 np0005592767 nova_compute[182623]: 2026-01-22 22:36:37.886 182627 DEBUG nova.compute.manager [req-5c62f524-88b3-4ac7-a6ad-7852e0457250 req-a3da33ea-4463-4787-aea7-74fe76ca9ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] No waiting events found dispatching network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:37 np0005592767 nova_compute[182623]: 2026-01-22 22:36:37.886 182627 WARNING nova.compute.manager [req-5c62f524-88b3-4ac7-a6ad-7852e0457250 req-a3da33ea-4463-4787-aea7-74fe76ca9ddc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received unexpected event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a for instance with vm_state active and task_state None.#033[00m
Jan 22 17:36:39 np0005592767 nova_compute[182623]: 2026-01-22 22:36:39.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:39 np0005592767 podman[227861]: 2026-01-22 22:36:39.165401816 +0000 UTC m=+0.076640815 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:36:41 np0005592767 nova_compute[182623]: 2026-01-22 22:36:41.642 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:44 np0005592767 nova_compute[182623]: 2026-01-22 22:36:44.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:46 np0005592767 nova_compute[182623]: 2026-01-22 22:36:46.652 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:48Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:66:ea 10.100.0.8
Jan 22 17:36:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:48Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:66:ea 10.100.0.8
Jan 22 17:36:49 np0005592767 nova_compute[182623]: 2026-01-22 22:36:49.117 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:50 np0005592767 podman[227906]: 2026-01-22 22:36:50.191667697 +0000 UTC m=+0.095520348 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:36:51 np0005592767 nova_compute[182623]: 2026-01-22 22:36:51.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:54 np0005592767 nova_compute[182623]: 2026-01-22 22:36:54.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:55 np0005592767 nova_compute[182623]: 2026-01-22 22:36:55.996 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:55 np0005592767 nova_compute[182623]: 2026-01-22 22:36:55.997 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:55 np0005592767 nova_compute[182623]: 2026-01-22 22:36:55.998 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:55 np0005592767 nova_compute[182623]: 2026-01-22 22:36:55.998 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:55 np0005592767 nova_compute[182623]: 2026-01-22 22:36:55.999 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.022 182627 INFO nova.compute.manager [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Terminating instance#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.035 182627 DEBUG nova.compute.manager [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:36:56 np0005592767 kernel: tapf974cce2-59 (unregistering): left promiscuous mode
Jan 22 17:36:56 np0005592767 NetworkManager[54973]: <info>  [1769121416.0633] device (tapf974cce2-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:36:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:56Z|00457|binding|INFO|Releasing lport f974cce2-59f5-4552-a588-26d630ab871a from this chassis (sb_readonly=0)
Jan 22 17:36:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:56Z|00458|binding|INFO|Setting lport f974cce2-59f5-4552-a588-26d630ab871a down in Southbound
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.076 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:36:56Z|00459|binding|INFO|Removing iface tapf974cce2-59 ovn-installed in OVS
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.083 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:66:ea 10.100.0.8'], port_security=['fa:16:3e:b8:66:ea 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a5678696f747b3ac42324626646e40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4623b9c-88d2-4fe4-8e2e-2e7f844f0cc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496a638d-9ea1-4fa8-a40b-e23f20cd36c0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=f974cce2-59f5-4552-a588-26d630ab871a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.084 104135 INFO neutron.agent.ovn.metadata.agent [-] Port f974cce2-59f5-4552-a588-26d630ab871a in datapath 8cfbdc2a-d644-40be-b1e2-2d2471aaf695 unbound from our chassis#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.086 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cfbdc2a-d644-40be-b1e2-2d2471aaf695, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.088 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e24faac6-0c56-451a-ab2f-1edcc9371ced]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.089 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 namespace which is not needed anymore#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.101 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000073.scope: Deactivated successfully.
Jan 22 17:36:56 np0005592767 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000073.scope: Consumed 12.468s CPU time.
Jan 22 17:36:56 np0005592767 systemd-machined[153912]: Machine qemu-59-instance-00000073 terminated.
Jan 22 17:36:56 np0005592767 podman[227929]: 2026-01-22 22:36:56.167933843 +0000 UTC m=+0.076915273 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:36:56 np0005592767 podman[227926]: 2026-01-22 22:36:56.173054745 +0000 UTC m=+0.090401727 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 17:36:56 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [NOTICE]   (227850) : haproxy version is 2.8.14-c23fe91
Jan 22 17:36:56 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [NOTICE]   (227850) : path to executable is /usr/sbin/haproxy
Jan 22 17:36:56 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [WARNING]  (227850) : Exiting Master process...
Jan 22 17:36:56 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [ALERT]    (227850) : Current worker (227852) exited with code 143 (Terminated)
Jan 22 17:36:56 np0005592767 neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695[227846]: [WARNING]  (227850) : All workers exited. Exiting... (0)
Jan 22 17:36:56 np0005592767 systemd[1]: libpod-21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39.scope: Deactivated successfully.
Jan 22 17:36:56 np0005592767 podman[227996]: 2026-01-22 22:36:56.220074858 +0000 UTC m=+0.040679739 container died 21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:36:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39-userdata-shm.mount: Deactivated successfully.
Jan 22 17:36:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay-a723b82dc43613ac8b9beaa9170e589dea38541e5140c5dfac82ca4faf9e2eaa-merged.mount: Deactivated successfully.
Jan 22 17:36:56 np0005592767 podman[227996]: 2026-01-22 22:36:56.256988741 +0000 UTC m=+0.077593612 container cleanup 21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:36:56 np0005592767 systemd[1]: libpod-conmon-21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39.scope: Deactivated successfully.
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.288 182627 DEBUG nova.compute.manager [req-8b7ad55d-1a98-43b6-86e8-ee023c8fd209 req-16079990-987f-411c-b044-efc65bd439bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-vif-unplugged-f974cce2-59f5-4552-a588-26d630ab871a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.289 182627 DEBUG oslo_concurrency.lockutils [req-8b7ad55d-1a98-43b6-86e8-ee023c8fd209 req-16079990-987f-411c-b044-efc65bd439bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.289 182627 DEBUG oslo_concurrency.lockutils [req-8b7ad55d-1a98-43b6-86e8-ee023c8fd209 req-16079990-987f-411c-b044-efc65bd439bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.290 182627 DEBUG oslo_concurrency.lockutils [req-8b7ad55d-1a98-43b6-86e8-ee023c8fd209 req-16079990-987f-411c-b044-efc65bd439bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.290 182627 DEBUG nova.compute.manager [req-8b7ad55d-1a98-43b6-86e8-ee023c8fd209 req-16079990-987f-411c-b044-efc65bd439bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] No waiting events found dispatching network-vif-unplugged-f974cce2-59f5-4552-a588-26d630ab871a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.290 182627 DEBUG nova.compute.manager [req-8b7ad55d-1a98-43b6-86e8-ee023c8fd209 req-16079990-987f-411c-b044-efc65bd439bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-vif-unplugged-f974cce2-59f5-4552-a588-26d630ab871a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.303 182627 INFO nova.virt.libvirt.driver [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Instance destroyed successfully.#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.305 182627 DEBUG nova.objects.instance [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lazy-loading 'resources' on Instance uuid a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.321 182627 DEBUG nova.virt.libvirt.vif [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-764702493',display_name='tempest-ServersTestJSON-server-764702493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-764702493',id=115,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:36:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='25a5678696f747b3ac42324626646e40',ramdisk_id='',reservation_id='r-95btt9xg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1676167595',owner_user_name='tempest-ServersTestJSON-1676167595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:36:35Z,user_data=None,user_id='10767689cb2d4ee383920e3d388a6dfe',uuid=a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:36:56 np0005592767 podman[228027]: 2026-01-22 22:36:56.323499684 +0000 UTC m=+0.045972305 container remove 21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.323 182627 DEBUG nova.network.os_vif_util [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converting VIF {"id": "f974cce2-59f5-4552-a588-26d630ab871a", "address": "fa:16:3e:b8:66:ea", "network": {"id": "8cfbdc2a-d644-40be-b1e2-2d2471aaf695", "bridge": "br-int", "label": "tempest-ServersTestJSON-439578983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a5678696f747b3ac42324626646e40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf974cce2-59", "ovs_interfaceid": "f974cce2-59f5-4552-a588-26d630ab871a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.324 182627 DEBUG nova.network.os_vif_util [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.325 182627 DEBUG os_vif [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.327 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.327 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf974cce2-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.328 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[da93927e-83d2-46d5-bcfc-b50f0aba2893]: (4, ('Thu Jan 22 10:36:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 (21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39)\n21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39\nThu Jan 22 10:36:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 (21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39)\n21c8cd0130422013ec6ec92d993683480d7189e5995aeba3292b0efd470dbd39\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.329 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.330 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.332 182627 INFO os_vif [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:66:ea,bridge_name='br-int',has_traffic_filtering=True,id=f974cce2-59f5-4552-a588-26d630ab871a,network=Network(8cfbdc2a-d644-40be-b1e2-2d2471aaf695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf974cce2-59')#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.333 182627 INFO nova.virt.libvirt.driver [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Deleting instance files /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd_del#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.333 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6424dd84-3833-4b40-8e93-3a47bd6b2032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.333 182627 INFO nova.virt.libvirt.driver [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Deletion of /var/lib/nova/instances/a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd_del complete#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.334 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cfbdc2a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.335 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 kernel: tap8cfbdc2a-d0: left promiscuous mode
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.347 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.350 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fdde0df2-f9b1-4cef-8047-258a3058f9bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.369 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[35e22af6-a5f7-437e-934f-bf2b53f994fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.371 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[368df121-2666-400c-af36-f7c58656f6e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.388 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7938eb-d145-4566-bec9-7376eb58a477]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499148, 'reachable_time': 26817, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228053, 'error': None, 'target': 'ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.392 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8cfbdc2a-d644-40be-b1e2-2d2471aaf695 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:36:56 np0005592767 systemd[1]: run-netns-ovnmeta\x2d8cfbdc2a\x2dd644\x2d40be\x2db1e2\x2d2d2471aaf695.mount: Deactivated successfully.
Jan 22 17:36:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:36:56.392 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[58313bc4-0228-4ebb-aa28-378b96b76bcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.414 182627 INFO nova.compute.manager [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.414 182627 DEBUG oslo.service.loopingcall [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.415 182627 DEBUG nova.compute.manager [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.415 182627 DEBUG nova.network.neutron [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:36:56 np0005592767 nova_compute[182623]: 2026-01-22 22:36:56.658 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.692 182627 DEBUG nova.network.neutron [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.709 182627 INFO nova.compute.manager [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Took 1.29 seconds to deallocate network for instance.#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.816 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.817 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.892 182627 DEBUG nova.compute.provider_tree [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.906 182627 DEBUG nova.scheduler.client.report [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.923 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.949 182627 INFO nova.scheduler.client.report [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Deleted allocations for instance a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd#033[00m
Jan 22 17:36:57 np0005592767 nova_compute[182623]: 2026-01-22 22:36:57.975 182627 DEBUG nova.compute.manager [req-51b9572d-b9f3-4afb-ad81-b56cf1f41162 req-3d282cb0-94e6-4234-95a8-db444ffe8748 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-vif-deleted-f974cce2-59f5-4552-a588-26d630ab871a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.029 182627 DEBUG oslo_concurrency.lockutils [None req-21fead83-50c5-467c-b11a-d543bfa8a022 10767689cb2d4ee383920e3d388a6dfe 25a5678696f747b3ac42324626646e40 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.345 182627 DEBUG nova.compute.manager [req-e11001ad-3aea-48ec-b6b4-40d6b0000dce req-b877fc51-847a-48b6-9e48-f96ad7700502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.346 182627 DEBUG oslo_concurrency.lockutils [req-e11001ad-3aea-48ec-b6b4-40d6b0000dce req-b877fc51-847a-48b6-9e48-f96ad7700502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.346 182627 DEBUG oslo_concurrency.lockutils [req-e11001ad-3aea-48ec-b6b4-40d6b0000dce req-b877fc51-847a-48b6-9e48-f96ad7700502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.346 182627 DEBUG oslo_concurrency.lockutils [req-e11001ad-3aea-48ec-b6b4-40d6b0000dce req-b877fc51-847a-48b6-9e48-f96ad7700502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.347 182627 DEBUG nova.compute.manager [req-e11001ad-3aea-48ec-b6b4-40d6b0000dce req-b877fc51-847a-48b6-9e48-f96ad7700502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] No waiting events found dispatching network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:36:58 np0005592767 nova_compute[182623]: 2026-01-22 22:36:58.347 182627 WARNING nova.compute.manager [req-e11001ad-3aea-48ec-b6b4-40d6b0000dce req-b877fc51-847a-48b6-9e48-f96ad7700502 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Received unexpected event network-vif-plugged-f974cce2-59f5-4552-a588-26d630ab871a for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.511 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.512 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.528 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.611 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.612 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.617 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.617 182627 INFO nova.compute.claims [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.902 182627 DEBUG nova.compute.provider_tree [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.919 182627 DEBUG nova.scheduler.client.report [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.962 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:00 np0005592767 nova_compute[182623]: 2026-01-22 22:37:00.962 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.022 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.023 182627 DEBUG nova.network.neutron [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.041 182627 INFO nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.058 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.160 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.162 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.162 182627 INFO nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Creating image(s)#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.163 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.163 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.164 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.178 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.231 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.232 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.232 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.242 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.292 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.293 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.324 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.324 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.325 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.340 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.377 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.377 182627 DEBUG nova.virt.disk.api [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Checking if we can resize image /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.378 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.429 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.430 182627 DEBUG nova.virt.disk.api [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Cannot resize image /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.431 182627 DEBUG nova.objects.instance [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'migration_context' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.445 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.445 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Ensure instance console log exists: /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.446 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.446 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.447 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.660 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:01 np0005592767 nova_compute[182623]: 2026-01-22 22:37:01.742 182627 DEBUG nova.policy [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:37:02 np0005592767 podman[228071]: 2026-01-22 22:37:02.132963366 +0000 UTC m=+0.047983421 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:37:02 np0005592767 podman[228070]: 2026-01-22 22:37:02.157028613 +0000 UTC m=+0.076362777 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:37:03 np0005592767 nova_compute[182623]: 2026-01-22 22:37:03.644 182627 DEBUG nova.network.neutron [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Successfully created port: 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:37:04 np0005592767 nova_compute[182623]: 2026-01-22 22:37:04.977 182627 DEBUG nova.network.neutron [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Successfully updated port: 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:37:04 np0005592767 nova_compute[182623]: 2026-01-22 22:37:04.997 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:04 np0005592767 nova_compute[182623]: 2026-01-22 22:37:04.998 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:04 np0005592767 nova_compute[182623]: 2026-01-22 22:37:04.999 182627 DEBUG nova.network.neutron [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:37:05 np0005592767 nova_compute[182623]: 2026-01-22 22:37:05.117 182627 DEBUG nova.compute.manager [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-changed-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:05 np0005592767 nova_compute[182623]: 2026-01-22 22:37:05.117 182627 DEBUG nova.compute.manager [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Refreshing instance network info cache due to event network-changed-570f7f3c-2f2a-4f46-87ea-784d78f09cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:37:05 np0005592767 nova_compute[182623]: 2026-01-22 22:37:05.118 182627 DEBUG oslo_concurrency.lockutils [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:05 np0005592767 nova_compute[182623]: 2026-01-22 22:37:05.215 182627 DEBUG nova.network.neutron [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:37:06 np0005592767 nova_compute[182623]: 2026-01-22 22:37:06.341 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:06 np0005592767 nova_compute[182623]: 2026-01-22 22:37:06.663 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.066 182627 DEBUG nova.network.neutron [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.220 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.221 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance network_info: |[{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.222 182627 DEBUG oslo_concurrency.lockutils [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.222 182627 DEBUG nova.network.neutron [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Refreshing network info cache for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.227 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Start _get_guest_xml network_info=[{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.232 182627 WARNING nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.238 182627 DEBUG nova.virt.libvirt.host [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.239 182627 DEBUG nova.virt.libvirt.host [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.242 182627 DEBUG nova.virt.libvirt.host [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.243 182627 DEBUG nova.virt.libvirt.host [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.244 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.244 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.245 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.245 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.246 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.246 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.246 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.246 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.247 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.247 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.247 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.248 182627 DEBUG nova.virt.hardware [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.252 182627 DEBUG nova.virt.libvirt.vif [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:36:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1990766256',display_name='tempest-ServerActionsTestOtherB-server-1990766256',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1990766256',id=119,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-rgsieo8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=17a24497-f021-486d-8f08-892c79ea1d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.252 182627 DEBUG nova.network.os_vif_util [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.253 182627 DEBUG nova.network.os_vif_util [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.255 182627 DEBUG nova.objects.instance [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'pci_devices' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.270 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <uuid>17a24497-f021-486d-8f08-892c79ea1d31</uuid>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <name>instance-00000077</name>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerActionsTestOtherB-server-1990766256</nova:name>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:37:07</nova:creationTime>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:user uuid="8b15fdf3e23640a2b9579790941bb346">tempest-ServerActionsTestOtherB-1598778832-project-member</nova:user>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:project uuid="abdd987d004046138277253df8658aca">tempest-ServerActionsTestOtherB-1598778832</nova:project>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        <nova:port uuid="570f7f3c-2f2a-4f46-87ea-784d78f09cc1">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <entry name="serial">17a24497-f021-486d-8f08-892c79ea1d31</entry>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <entry name="uuid">17a24497-f021-486d-8f08-892c79ea1d31</entry>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:50:26:52"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <target dev="tap570f7f3c-2f"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/console.log" append="off"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:37:07 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:37:07 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:37:07 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:37:07 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.271 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Preparing to wait for external event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.271 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.271 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.272 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.272 182627 DEBUG nova.virt.libvirt.vif [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:36:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1990766256',display_name='tempest-ServerActionsTestOtherB-server-1990766256',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1990766256',id=119,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-rgsieo8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:37:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=17a24497-f021-486d-8f08-892c79ea1d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.273 182627 DEBUG nova.network.os_vif_util [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.273 182627 DEBUG nova.network.os_vif_util [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.274 182627 DEBUG os_vif [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.274 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.274 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.275 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.277 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.277 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap570f7f3c-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.278 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap570f7f3c-2f, col_values=(('external_ids', {'iface-id': '570f7f3c-2f2a-4f46-87ea-784d78f09cc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:26:52', 'vm-uuid': '17a24497-f021-486d-8f08-892c79ea1d31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:07 np0005592767 NetworkManager[54973]: <info>  [1769121427.2802] manager: (tap570f7f3c-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.283 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.286 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.286 182627 INFO os_vif [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f')#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.413 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.413 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.413 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No VIF found with MAC fa:16:3e:50:26:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.414 182627 INFO nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Using config drive#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.780 182627 INFO nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Creating config drive at /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.786 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl6jaxp_k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:07 np0005592767 nova_compute[182623]: 2026-01-22 22:37:07.911 182627 DEBUG oslo_concurrency.processutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl6jaxp_k" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:07 np0005592767 kernel: tap570f7f3c-2f: entered promiscuous mode
Jan 22 17:37:07 np0005592767 NetworkManager[54973]: <info>  [1769121427.9685] manager: (tap570f7f3c-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.021 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:08Z|00460|binding|INFO|Claiming lport 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 for this chassis.
Jan 22 17:37:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:08Z|00461|binding|INFO|570f7f3c-2f2a-4f46-87ea-784d78f09cc1: Claiming fa:16:3e:50:26:52 10.100.0.11
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.023 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.028 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.039 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:26:52 10.100.0.11'], port_security=['fa:16:3e:50:26:52 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b9a45c4-3bd4-4f5f-b26b-5b1ab95bdd58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=570f7f3c-2f2a-4f46-87ea-784d78f09cc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.041 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 bound to our chassis#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.042 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84d8b010-d968-4df4-bedf-0c350ae42113#033[00m
Jan 22 17:37:08 np0005592767 systemd-udevd[228132]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.053 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18ab4bd6-1013-49c9-b921-9804f7808763]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.054 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap84d8b010-d1 in ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.056 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap84d8b010-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.056 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fc254c25-1995-4b4f-8f9a-5a9a31880f01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.057 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[545d1ab9-f346-429a-8b0e-8467757231da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 systemd-machined[153912]: New machine qemu-60-instance-00000077.
Jan 22 17:37:08 np0005592767 NetworkManager[54973]: <info>  [1769121428.0676] device (tap570f7f3c-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:37:08 np0005592767 NetworkManager[54973]: <info>  [1769121428.0682] device (tap570f7f3c-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.068 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[e853a095-9307-41a7-a135-e1b42186df3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.079 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 systemd[1]: Started Virtual Machine qemu-60-instance-00000077.
Jan 22 17:37:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:08Z|00462|binding|INFO|Setting lport 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 ovn-installed in OVS
Jan 22 17:37:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:08Z|00463|binding|INFO|Setting lport 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 up in Southbound
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.083 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3c868c87-c5ed-428c-a2d2-a277d3b4a12d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.086 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.113 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[92c8d15d-e6e5-4dc2-98d7-15516be3c22a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 NetworkManager[54973]: <info>  [1769121428.1190] manager: (tap84d8b010-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.118 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1c4702-89a0-4ffb-9f0c-b36a493b1d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 systemd-udevd[228136]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.147 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[29d312b6-f973-4881-8633-3af997661523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.150 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2517ae36-980e-4c86-b3d5-24a3734eb2b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 NetworkManager[54973]: <info>  [1769121428.1705] device (tap84d8b010-d0): carrier: link connected
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.175 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[97995c64-4b59-4b15-b4f9-3d9b2b7982c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.191 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[43ac30cb-e521-4005-b5cf-b537aaf0dd91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502474, 'reachable_time': 17322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228165, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.206 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ef34c872-7b7d-45e6-8c76-c339bb6976da]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:3d39'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502474, 'tstamp': 502474}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228166, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.225 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[80aba783-c7c5-4ccb-b7e8-62cd60415e77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502474, 'reachable_time': 17322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228167, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.253 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[01f2be3f-390f-476f-84ac-3c1fd95cb53d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.307 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[df18b4d7-65c7-4c59-9ff2-b0a94b0d3e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.309 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.309 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.310 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84d8b010-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:08 np0005592767 NetworkManager[54973]: <info>  [1769121428.3120] manager: (tap84d8b010-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.311 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 kernel: tap84d8b010-d0: entered promiscuous mode
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.315 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.316 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84d8b010-d0, col_values=(('external_ids', {'iface-id': '8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:08 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:08Z|00464|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.322 182627 DEBUG nova.compute.manager [req-257d25c0-277b-4609-9b5d-abff6894c429 req-8b791474-3d10-4968-a609-ef8ee2a12023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.323 182627 DEBUG oslo_concurrency.lockutils [req-257d25c0-277b-4609-9b5d-abff6894c429 req-8b791474-3d10-4968-a609-ef8ee2a12023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.323 182627 DEBUG oslo_concurrency.lockutils [req-257d25c0-277b-4609-9b5d-abff6894c429 req-8b791474-3d10-4968-a609-ef8ee2a12023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.323 182627 DEBUG oslo_concurrency.lockutils [req-257d25c0-277b-4609-9b5d-abff6894c429 req-8b791474-3d10-4968-a609-ef8ee2a12023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.346 182627 DEBUG nova.compute.manager [req-257d25c0-277b-4609-9b5d-abff6894c429 req-8b791474-3d10-4968-a609-ef8ee2a12023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Processing event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.347 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.350 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.352 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.353 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8f6e4536-a5c2-41fa-bf55-ae2e96315c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.354 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:37:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:08.354 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'env', 'PROCESS_TAG=haproxy-84d8b010-d968-4df4-bedf-0c350ae42113', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/84d8b010-d968-4df4-bedf-0c350ae42113.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.569 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121428.5680053, 17a24497-f021-486d-8f08-892c79ea1d31 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.569 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] VM Started (Lifecycle Event)#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.579 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.582 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.586 182627 INFO nova.virt.libvirt.driver [-] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance spawned successfully.#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.587 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.593 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.598 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.610 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.611 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.612 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.612 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.613 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.614 182627 DEBUG nova.virt.libvirt.driver [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.620 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.621 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121428.568519, 17a24497-f021-486d-8f08-892c79ea1d31 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.622 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.794 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.798 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121428.581817, 17a24497-f021-486d-8f08-892c79ea1d31 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.799 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:37:08 np0005592767 podman[228204]: 2026-01-22 22:37:08.808944232 +0000 UTC m=+0.070468254 container create 099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:37:08 np0005592767 podman[228204]: 2026-01-22 22:37:08.764950233 +0000 UTC m=+0.026474255 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.878 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.883 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:37:08 np0005592767 systemd[1]: Started libpod-conmon-099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee.scope.
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.897 182627 INFO nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Took 7.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.898 182627 DEBUG nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.903 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:37:08 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:37:08 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c322baf418e8a199deb9b7abc4b5a1c54499a4c3b9d0419fbbacc2422e2fcbc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:37:08 np0005592767 podman[228204]: 2026-01-22 22:37:08.936541248 +0000 UTC m=+0.198065250 container init 099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 17:37:08 np0005592767 podman[228204]: 2026-01-22 22:37:08.943624415 +0000 UTC m=+0.205148397 container start 099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:37:08 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [NOTICE]   (228223) : New worker (228225) forked
Jan 22 17:37:08 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [NOTICE]   (228223) : Loading success.
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.983 182627 INFO nova.compute.manager [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Took 8.40 seconds to build instance.#033[00m
Jan 22 17:37:08 np0005592767 nova_compute[182623]: 2026-01-22 22:37:08.998 182627 DEBUG oslo_concurrency.lockutils [None req-a1f00ced-4093-4c6c-b4b0-6987556e064b 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:09 np0005592767 nova_compute[182623]: 2026-01-22 22:37:09.066 182627 DEBUG nova.network.neutron [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updated VIF entry in instance network info cache for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:37:09 np0005592767 nova_compute[182623]: 2026-01-22 22:37:09.067 182627 DEBUG nova.network.neutron [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:09 np0005592767 nova_compute[182623]: 2026-01-22 22:37:09.082 182627 DEBUG oslo_concurrency.lockutils [req-7c476277-a08f-43ec-9c4e-162e2d856d30 req-8e28de71-63ba-4643-b701-fc24eef80e41 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:10 np0005592767 podman[228234]: 2026-01-22 22:37:10.134512177 +0000 UTC m=+0.056541918 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:37:10 np0005592767 nova_compute[182623]: 2026-01-22 22:37:10.414 182627 DEBUG nova.compute.manager [req-1e1413bf-a45e-46ee-9462-1e92631c9bdc req-82933992-5504-4f8a-9172-e41026303150 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:10 np0005592767 nova_compute[182623]: 2026-01-22 22:37:10.414 182627 DEBUG oslo_concurrency.lockutils [req-1e1413bf-a45e-46ee-9462-1e92631c9bdc req-82933992-5504-4f8a-9172-e41026303150 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:10 np0005592767 nova_compute[182623]: 2026-01-22 22:37:10.415 182627 DEBUG oslo_concurrency.lockutils [req-1e1413bf-a45e-46ee-9462-1e92631c9bdc req-82933992-5504-4f8a-9172-e41026303150 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:10 np0005592767 nova_compute[182623]: 2026-01-22 22:37:10.415 182627 DEBUG oslo_concurrency.lockutils [req-1e1413bf-a45e-46ee-9462-1e92631c9bdc req-82933992-5504-4f8a-9172-e41026303150 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:10 np0005592767 nova_compute[182623]: 2026-01-22 22:37:10.415 182627 DEBUG nova.compute.manager [req-1e1413bf-a45e-46ee-9462-1e92631c9bdc req-82933992-5504-4f8a-9172-e41026303150 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] No waiting events found dispatching network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:37:10 np0005592767 nova_compute[182623]: 2026-01-22 22:37:10.415 182627 WARNING nova.compute.manager [req-1e1413bf-a45e-46ee-9462-1e92631c9bdc req-82933992-5504-4f8a-9172-e41026303150 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received unexpected event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.137 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:11 np0005592767 NetworkManager[54973]: <info>  [1769121431.1378] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 22 17:37:11 np0005592767 NetworkManager[54973]: <info>  [1769121431.1392] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.231 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:11Z|00465|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.244 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.301 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121416.3001354, a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.301 182627 INFO nova.compute.manager [-] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.325 182627 DEBUG nova.compute.manager [None req-9336276c-47a6-4a62-ad62-201c632eac81 - - - - - -] [instance: a02a3beb-e22a-463e-b9ec-2d5f5dd0d8dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:11 np0005592767 nova_compute[182623]: 2026-01-22 22:37:11.664 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:12.110 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:12.112 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:12.113 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:12 np0005592767 nova_compute[182623]: 2026-01-22 22:37:12.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:12 np0005592767 nova_compute[182623]: 2026-01-22 22:37:12.663 182627 DEBUG nova.compute.manager [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-changed-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:12 np0005592767 nova_compute[182623]: 2026-01-22 22:37:12.664 182627 DEBUG nova.compute.manager [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Refreshing instance network info cache due to event network-changed-570f7f3c-2f2a-4f46-87ea-784d78f09cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:37:12 np0005592767 nova_compute[182623]: 2026-01-22 22:37:12.664 182627 DEBUG oslo_concurrency.lockutils [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:12 np0005592767 nova_compute[182623]: 2026-01-22 22:37:12.664 182627 DEBUG oslo_concurrency.lockutils [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:12 np0005592767 nova_compute[182623]: 2026-01-22 22:37:12.665 182627 DEBUG nova.network.neutron [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Refreshing network info cache for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:37:14 np0005592767 nova_compute[182623]: 2026-01-22 22:37:14.466 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Creating tmpfile /var/lib/nova/instances/tmptlznbsdr to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 22 17:37:14 np0005592767 nova_compute[182623]: 2026-01-22 22:37:14.507 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:14 np0005592767 nova_compute[182623]: 2026-01-22 22:37:14.671 182627 DEBUG nova.compute.manager [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptlznbsdr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 22 17:37:14 np0005592767 nova_compute[182623]: 2026-01-22 22:37:14.888 182627 DEBUG nova.network.neutron [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updated VIF entry in instance network info cache for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:37:14 np0005592767 nova_compute[182623]: 2026-01-22 22:37:14.889 182627 DEBUG nova.network.neutron [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:14 np0005592767 nova_compute[182623]: 2026-01-22 22:37:14.916 182627 DEBUG oslo_concurrency.lockutils [req-7cbc2c80-465d-4d4a-8cbf-ed7f28fe8d1a req-c537b7f8-7be8-490f-80e7-9bf3791e604c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:15 np0005592767 nova_compute[182623]: 2026-01-22 22:37:15.860 182627 DEBUG nova.compute.manager [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptlznbsdr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ade065ae-eabe-4971-b544-a7ce3257b082',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 22 17:37:15 np0005592767 nova_compute[182623]: 2026-01-22 22:37:15.941 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:15 np0005592767 nova_compute[182623]: 2026-01-22 22:37:15.942 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquired lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:15 np0005592767 nova_compute[182623]: 2026-01-22 22:37:15.942 182627 DEBUG nova.network.neutron [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:37:16 np0005592767 nova_compute[182623]: 2026-01-22 22:37:16.666 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:16.713 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:37:16 np0005592767 nova_compute[182623]: 2026-01-22 22:37:16.714 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:16.715 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:37:17 np0005592767 nova_compute[182623]: 2026-01-22 22:37:17.281 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.235 182627 DEBUG nova.network.neutron [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Updating instance_info_cache with network_info: [{"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.257 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Releasing lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.272 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptlznbsdr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ade065ae-eabe-4971-b544-a7ce3257b082',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.274 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Creating instance directory: /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.274 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Creating disk.info with the contents: {'/var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk': 'qcow2', '/var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.275 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.276 182627 DEBUG nova.objects.instance [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ade065ae-eabe-4971-b544-a7ce3257b082 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.323 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.419 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.420 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.421 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.431 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.524 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.525 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.572 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.573 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.574 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.666 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.667 182627 DEBUG nova.virt.disk.api [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Checking if we can resize image /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.668 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.732 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.734 182627 DEBUG nova.virt.disk.api [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Cannot resize image /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.734 182627 DEBUG nova.objects.instance [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lazy-loading 'migration_context' on Instance uuid ade065ae-eabe-4971-b544-a7ce3257b082 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.754 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.783 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.784 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk.config to /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:37:18 np0005592767 nova_compute[182623]: 2026-01-22 22:37:18.785 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk.config /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.362 182627 DEBUG oslo_concurrency.processutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk.config /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.363 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.366 182627 DEBUG nova.virt.libvirt.vif [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:36:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1408156996',display_name='tempest-TestNetworkAdvancedServerOps-server-1408156996',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1408156996',id=118,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNoIJTepGdykgsu+Hc4G5LBrSdR7LL3T6ljbXPiudwogM2bHm7/qR6mXmRdgKMdLRQDUJKUhwlku+ugLG11cFT+AoRlNK6bfxjnoLa5FeCxy/FowMUH0kyjxh2b0vfl7Jg==',key_name='tempest-TestNetworkAdvancedServerOps-1971532883',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:36:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-exm9vyqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:36:51Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=ade065ae-eabe-4971-b544-a7ce3257b082,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.366 182627 DEBUG nova.network.os_vif_util [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converting VIF {"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.368 182627 DEBUG nova.network.os_vif_util [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:78:d2,bridge_name='br-int',has_traffic_filtering=True,id=8a753cac-4253-415c-af0a-07dba97847f4,network=Network(493d7061-afab-4afb-8a74-384eb3cdb0f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a753cac-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.368 182627 DEBUG os_vif [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:78:d2,bridge_name='br-int',has_traffic_filtering=True,id=8a753cac-4253-415c-af0a-07dba97847f4,network=Network(493d7061-afab-4afb-8a74-384eb3cdb0f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a753cac-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.369 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.370 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.371 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.375 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.376 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a753cac-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.377 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a753cac-42, col_values=(('external_ids', {'iface-id': '8a753cac-4253-415c-af0a-07dba97847f4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:78:d2', 'vm-uuid': 'ade065ae-eabe-4971-b544-a7ce3257b082'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.379 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:19 np0005592767 NetworkManager[54973]: <info>  [1769121439.3798] manager: (tap8a753cac-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.383 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.441 182627 INFO os_vif [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:78:d2,bridge_name='br-int',has_traffic_filtering=True,id=8a753cac-4253-415c-af0a-07dba97847f4,network=Network(493d7061-afab-4afb-8a74-384eb3cdb0f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a753cac-42')#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.442 182627 DEBUG nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 22 17:37:19 np0005592767 nova_compute[182623]: 2026-01-22 22:37:19.443 182627 DEBUG nova.compute.manager [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptlznbsdr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ade065ae-eabe-4971-b544-a7ce3257b082',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 22 17:37:20 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:20.718 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:20 np0005592767 nova_compute[182623]: 2026-01-22 22:37:20.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.082 182627 DEBUG nova.network.neutron [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Port 8a753cac-4253-415c-af0a-07dba97847f4 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.096 182627 DEBUG nova.compute.manager [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptlznbsdr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ade065ae-eabe-4971-b544-a7ce3257b082',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 22 17:37:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:21Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:50:26:52 10.100.0.11
Jan 22 17:37:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:21Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:50:26:52 10.100.0.11
Jan 22 17:37:21 np0005592767 podman[228299]: 2026-01-22 22:37:21.21849657 +0000 UTC m=+0.110701078 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 17:37:21 np0005592767 systemd[1]: Starting libvirt proxy daemon...
Jan 22 17:37:21 np0005592767 systemd[1]: Started libvirt proxy daemon.
Jan 22 17:37:21 np0005592767 kernel: tap8a753cac-42: entered promiscuous mode
Jan 22 17:37:21 np0005592767 NetworkManager[54973]: <info>  [1769121441.5287] manager: (tap8a753cac-42): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.529 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:21Z|00466|binding|INFO|Claiming lport 8a753cac-4253-415c-af0a-07dba97847f4 for this additional chassis.
Jan 22 17:37:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:21Z|00467|binding|INFO|8a753cac-4253-415c-af0a-07dba97847f4: Claiming fa:16:3e:f3:78:d2 10.100.0.13
Jan 22 17:37:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:21Z|00468|binding|INFO|Setting lport 8a753cac-4253-415c-af0a-07dba97847f4 ovn-installed in OVS
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:21 np0005592767 systemd-udevd[228353]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:37:21 np0005592767 systemd-machined[153912]: New machine qemu-61-instance-00000076.
Jan 22 17:37:21 np0005592767 systemd[1]: Started Virtual Machine qemu-61-instance-00000076.
Jan 22 17:37:21 np0005592767 NetworkManager[54973]: <info>  [1769121441.6200] device (tap8a753cac-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:37:21 np0005592767 NetworkManager[54973]: <info>  [1769121441.6209] device (tap8a753cac-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.669 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:21 np0005592767 nova_compute[182623]: 2026-01-22 22:37:21.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:22 np0005592767 nova_compute[182623]: 2026-01-22 22:37:22.137 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121442.1368172, ade065ae-eabe-4971-b544-a7ce3257b082 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:22 np0005592767 nova_compute[182623]: 2026-01-22 22:37:22.138 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] VM Started (Lifecycle Event)#033[00m
Jan 22 17:37:22 np0005592767 nova_compute[182623]: 2026-01-22 22:37:22.158 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:22 np0005592767 nova_compute[182623]: 2026-01-22 22:37:22.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:22 np0005592767 nova_compute[182623]: 2026-01-22 22:37:22.997 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121442.996825, ade065ae-eabe-4971-b544-a7ce3257b082 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:22 np0005592767 nova_compute[182623]: 2026-01-22 22:37:22.997 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:37:23 np0005592767 nova_compute[182623]: 2026-01-22 22:37:23.017 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:23 np0005592767 nova_compute[182623]: 2026-01-22 22:37:23.021 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:37:23 np0005592767 nova_compute[182623]: 2026-01-22 22:37:23.039 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 22 17:37:23 np0005592767 nova_compute[182623]: 2026-01-22 22:37:23.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:23 np0005592767 nova_compute[182623]: 2026-01-22 22:37:23.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:37:23 np0005592767 nova_compute[182623]: 2026-01-22 22:37:23.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:37:24 np0005592767 nova_compute[182623]: 2026-01-22 22:37:24.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:24 np0005592767 nova_compute[182623]: 2026-01-22 22:37:24.211 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:24 np0005592767 nova_compute[182623]: 2026-01-22 22:37:24.212 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:24 np0005592767 nova_compute[182623]: 2026-01-22 22:37:24.212 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:37:24 np0005592767 nova_compute[182623]: 2026-01-22 22:37:24.213 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:24 np0005592767 nova_compute[182623]: 2026-01-22 22:37:24.380 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:24Z|00469|binding|INFO|Claiming lport 8a753cac-4253-415c-af0a-07dba97847f4 for this chassis.
Jan 22 17:37:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:24Z|00470|binding|INFO|8a753cac-4253-415c-af0a-07dba97847f4: Claiming fa:16:3e:f3:78:d2 10.100.0.13
Jan 22 17:37:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:24Z|00471|binding|INFO|Setting lport 8a753cac-4253-415c-af0a-07dba97847f4 up in Southbound
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.943 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:78:d2 10.100.0.13'], port_security=['fa:16:3e:f3:78:d2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ade065ae-eabe-4971-b544-a7ce3257b082', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'dcdcacab-a0c3-4fbc-8105-2261115f68fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.216'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af971694-0394-4598-95f0-39043d271462, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=8a753cac-4253-415c-af0a-07dba97847f4) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.945 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 8a753cac-4253-415c-af0a-07dba97847f4 in datapath 493d7061-afab-4afb-8a74-384eb3cdb0f8 bound to our chassis#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.947 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 493d7061-afab-4afb-8a74-384eb3cdb0f8#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.964 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fc5704dc-3422-4457-b8d0-7500dd3f7c22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.966 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap493d7061-a1 in ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.971 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap493d7061-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.972 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4e56950c-6f84-491f-a3cb-dbde9ecf820d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.973 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a183a777-6e0f-4210-9bf5-33c8165e5709]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:24.991 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[188777b7-f18d-461e-b7cc-59ecc6b31f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.012 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[15a57903-1fb5-4064-a2a5-ec49416b917b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.054 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[bea903b0-9153-4d10-926b-4e74c40b793d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d476dd04-867e-413a-af65-f81dfe1ed3ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 NetworkManager[54973]: <info>  [1769121445.0627] manager: (tap493d7061-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/219)
Jan 22 17:37:25 np0005592767 systemd-udevd[228384]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.121 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[2630e3ce-301e-431b-8437-0d8188984e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.127 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd8d3be-0637-4e56-a47d-0330f3bf9325]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 NetworkManager[54973]: <info>  [1769121445.1665] device (tap493d7061-a0): carrier: link connected
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.247 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[856b7a67-cf9b-4136-ac84-d4733dd31a71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.267 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b67b43-a74e-4be1-8968-cf0fec95a751]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap493d7061-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:5c:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504174, 'reachable_time': 22871, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228403, 'error': None, 'target': 'ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.281 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5b4d45-92b6-4329-a6f1-8784e09bbb71]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:5c65'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504174, 'tstamp': 504174}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228404, 'error': None, 'target': 'ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.297 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f2de2a-690b-4009-89b7-869cfd17f093]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap493d7061-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:5c:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504174, 'reachable_time': 22871, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228405, 'error': None, 'target': 'ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.327 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ba297cbc-fbf2-4a01-97bb-72ea36fca351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.354 182627 INFO nova.compute.manager [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Post operation of migration started#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.395 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff97857-ebe1-4892-9cb2-13312c9d30a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.396 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap493d7061-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.397 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.397 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap493d7061-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.433 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:25 np0005592767 NetworkManager[54973]: <info>  [1769121445.4346] manager: (tap493d7061-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 22 17:37:25 np0005592767 kernel: tap493d7061-a0: entered promiscuous mode
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.437 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap493d7061-a0, col_values=(('external_ids', {'iface-id': '213590b9-d33a-4798-8a05-2718977a25c4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:25Z|00472|binding|INFO|Releasing lport 213590b9-d33a-4798-8a05-2718977a25c4 from this chassis (sb_readonly=0)
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.462 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.463 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/493d7061-afab-4afb-8a74-384eb3cdb0f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/493d7061-afab-4afb-8a74-384eb3cdb0f8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.464 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a10b9891-ba2a-4a4a-baae-62c32b4e8bdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.465 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-493d7061-afab-4afb-8a74-384eb3cdb0f8
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/493d7061-afab-4afb-8a74-384eb3cdb0f8.pid.haproxy
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 493d7061-afab-4afb-8a74-384eb3cdb0f8
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:37:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:25.466 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'env', 'PROCESS_TAG=haproxy-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/493d7061-afab-4afb-8a74-384eb3cdb0f8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.900 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.900 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquired lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:25 np0005592767 nova_compute[182623]: 2026-01-22 22:37:25.901 182627 DEBUG nova.network.neutron [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:37:26 np0005592767 podman[228438]: 2026-01-22 22:37:26.027155849 +0000 UTC m=+0.051552050 container create c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:37:26 np0005592767 systemd[1]: Started libpod-conmon-c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd.scope.
Jan 22 17:37:26 np0005592767 podman[228438]: 2026-01-22 22:37:26.002336591 +0000 UTC m=+0.026732802 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:37:26 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:37:26 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f39d06a443d59d3b2480d36a602fca55c825c3882698c684f009a22af96aa2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:37:26 np0005592767 podman[228438]: 2026-01-22 22:37:26.142455823 +0000 UTC m=+0.166852054 container init c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:37:26 np0005592767 podman[228438]: 2026-01-22 22:37:26.152035248 +0000 UTC m=+0.176431449 container start c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:37:26 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [NOTICE]   (228458) : New worker (228460) forked
Jan 22 17:37:26 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [NOTICE]   (228458) : Loading success.
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.672 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.923 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.936 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.937 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.937 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.937 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.937 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.938 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.960 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.960 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.960 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:26 np0005592767 nova_compute[182623]: 2026-01-22 22:37:26.960 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.034 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:27 np0005592767 podman[228471]: 2026-01-22 22:37:27.083109141 +0000 UTC m=+0.067686067 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.106 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.107 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:27 np0005592767 podman[228470]: 2026-01-22 22:37:27.143519555 +0000 UTC m=+0.133340886 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.176 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.189 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.255 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.257 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.352 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.543 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.544 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5312MB free_disk=73.13845443725586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.545 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.545 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.595 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration for instance ade065ae-eabe-4971-b544-a7ce3257b082 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.625 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Updating resource usage from migration c95f9bbc-3b80-4796-9b16-8f27d4346a59#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.626 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Starting to track incoming migration c95f9bbc-3b80-4796-9b16-8f27d4346a59 with flavor 63b0d901-60c2-48cb-afeb-72a71e897d3d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.682 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 17a24497-f021-486d-8f08-892c79ea1d31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.708 182627 WARNING nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance ade065ae-eabe-4971-b544-a7ce3257b082 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.709 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.709 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.855 182627 DEBUG nova.network.neutron [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Updating instance_info_cache with network_info: [{"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.864 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.883 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Releasing lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.886 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.924 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.924 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.928 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.928 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.928 182627 DEBUG oslo_concurrency.lockutils [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:27 np0005592767 nova_compute[182623]: 2026-01-22 22:37:27.933 182627 INFO nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 22 17:37:27 np0005592767 virtqemud[182095]: Domain id=61 name='instance-00000076' uuid=ade065ae-eabe-4971-b544-a7ce3257b082 is tainted: custom-monitor
Jan 22 17:37:28 np0005592767 nova_compute[182623]: 2026-01-22 22:37:28.421 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:28 np0005592767 nova_compute[182623]: 2026-01-22 22:37:28.939 182627 INFO nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.261 182627 DEBUG nova.compute.manager [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.328 182627 INFO nova.compute.manager [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] instance snapshotting#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.330 182627 DEBUG nova.objects.instance [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'flavor' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.382 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.666 182627 INFO nova.virt.libvirt.driver [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Beginning live snapshot process#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.883 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.884 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:29 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.916 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.945 182627 INFO nova.virt.libvirt.driver [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.950 182627 DEBUG nova.compute.manager [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.970 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.971 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:29 np0005592767 nova_compute[182623]: 2026-01-22 22:37:29.992 182627 DEBUG nova.objects.instance [None req-2dc82a07-3473-421c-baad-0f75cd318639 f7e2c572e7134b04a0b8b7e34234cc90 3846af94a423479bb03e96cd355920f0 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.035 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.053 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.109 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.111 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpzl_1anpe/dca94e6e54454120ae3069f062ad2819.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.150 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpzl_1anpe/dca94e6e54454120ae3069f062ad2819.delta 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.151 182627 INFO nova.virt.libvirt.driver [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.199 182627 DEBUG nova.virt.libvirt.guest [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.704 182627 DEBUG nova.virt.libvirt.guest [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.709 182627 INFO nova.virt.libvirt.driver [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.768 182627 DEBUG nova.privsep.utils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.769 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpzl_1anpe/dca94e6e54454120ae3069f062ad2819.delta /var/lib/nova/instances/snapshots/tmpzl_1anpe/dca94e6e54454120ae3069f062ad2819 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:30 np0005592767 nova_compute[182623]: 2026-01-22 22:37:30.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:37:31 np0005592767 nova_compute[182623]: 2026-01-22 22:37:31.130 182627 DEBUG oslo_concurrency.processutils [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpzl_1anpe/dca94e6e54454120ae3069f062ad2819.delta /var/lib/nova/instances/snapshots/tmpzl_1anpe/dca94e6e54454120ae3069f062ad2819" returned: 0 in 0.361s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:31 np0005592767 nova_compute[182623]: 2026-01-22 22:37:31.136 182627 INFO nova.virt.libvirt.driver [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:37:31 np0005592767 nova_compute[182623]: 2026-01-22 22:37:31.674 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:32Z|00473|binding|INFO|Releasing lport 213590b9-d33a-4798-8a05-2718977a25c4 from this chassis (sb_readonly=0)
Jan 22 17:37:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:32Z|00474|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:37:32 np0005592767 nova_compute[182623]: 2026-01-22 22:37:32.251 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:33 np0005592767 podman[228556]: 2026-01-22 22:37:33.151823921 +0000 UTC m=+0.056410355 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:37:33 np0005592767 podman[228557]: 2026-01-22 22:37:33.163229377 +0000 UTC m=+0.069851937 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:37:33 np0005592767 nova_compute[182623]: 2026-01-22 22:37:33.339 182627 INFO nova.compute.manager [None req-752d76df-ca89-4688-9481-7e7895e52d2e 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Get console output#033[00m
Jan 22 17:37:33 np0005592767 nova_compute[182623]: 2026-01-22 22:37:33.490 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:37:33 np0005592767 nova_compute[182623]: 2026-01-22 22:37:33.585 182627 INFO nova.virt.libvirt.driver [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Snapshot image upload complete#033[00m
Jan 22 17:37:33 np0005592767 nova_compute[182623]: 2026-01-22 22:37:33.586 182627 INFO nova.compute.manager [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Took 4.22 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:37:33 np0005592767 nova_compute[182623]: 2026-01-22 22:37:33.987 182627 DEBUG nova.compute.manager [None req-3f85056a-334d-467f-ba9c-b7912541e3e8 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Jan 22 17:37:34 np0005592767 nova_compute[182623]: 2026-01-22 22:37:34.384 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:34 np0005592767 nova_compute[182623]: 2026-01-22 22:37:34.988 182627 DEBUG nova.compute.manager [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Received event network-changed-8a753cac-4253-415c-af0a-07dba97847f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:34 np0005592767 nova_compute[182623]: 2026-01-22 22:37:34.989 182627 DEBUG nova.compute.manager [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Refreshing instance network info cache due to event network-changed-8a753cac-4253-415c-af0a-07dba97847f4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:37:34 np0005592767 nova_compute[182623]: 2026-01-22 22:37:34.989 182627 DEBUG oslo_concurrency.lockutils [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:37:34 np0005592767 nova_compute[182623]: 2026-01-22 22:37:34.990 182627 DEBUG oslo_concurrency.lockutils [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:37:34 np0005592767 nova_compute[182623]: 2026-01-22 22:37:34.991 182627 DEBUG nova.network.neutron [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Refreshing network info cache for port 8a753cac-4253-415c-af0a-07dba97847f4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.042 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "ade065ae-eabe-4971-b544-a7ce3257b082" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.043 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.044 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.044 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.045 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.061 182627 INFO nova.compute.manager [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Terminating instance#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.072 182627 DEBUG nova.compute.manager [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:37:35 np0005592767 kernel: tap8a753cac-42 (unregistering): left promiscuous mode
Jan 22 17:37:35 np0005592767 NetworkManager[54973]: <info>  [1769121455.1010] device (tap8a753cac-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:35Z|00475|binding|INFO|Releasing lport 8a753cac-4253-415c-af0a-07dba97847f4 from this chassis (sb_readonly=0)
Jan 22 17:37:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:35Z|00476|binding|INFO|Setting lport 8a753cac-4253-415c-af0a-07dba97847f4 down in Southbound
Jan 22 17:37:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:35Z|00477|binding|INFO|Removing iface tap8a753cac-42 ovn-installed in OVS
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.115 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.120 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:78:d2 10.100.0.13'], port_security=['fa:16:3e:f3:78:d2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ade065ae-eabe-4971-b544-a7ce3257b082', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'dcdcacab-a0c3-4fbc-8105-2261115f68fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af971694-0394-4598-95f0-39043d271462, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=8a753cac-4253-415c-af0a-07dba97847f4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.122 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 8a753cac-4253-415c-af0a-07dba97847f4 in datapath 493d7061-afab-4afb-8a74-384eb3cdb0f8 unbound from our chassis#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.125 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 493d7061-afab-4afb-8a74-384eb3cdb0f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.126 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0b4236-ad03-4d9c-a2c6-fbe7b6f67821]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.127 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8 namespace which is not needed anymore#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.132 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 22 17:37:35 np0005592767 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000076.scope: Consumed 1.934s CPU time.
Jan 22 17:37:35 np0005592767 systemd-machined[153912]: Machine qemu-61-instance-00000076 terminated.
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.297 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [NOTICE]   (228458) : haproxy version is 2.8.14-c23fe91
Jan 22 17:37:35 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [NOTICE]   (228458) : path to executable is /usr/sbin/haproxy
Jan 22 17:37:35 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [WARNING]  (228458) : Exiting Master process...
Jan 22 17:37:35 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [ALERT]    (228458) : Current worker (228460) exited with code 143 (Terminated)
Jan 22 17:37:35 np0005592767 neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8[228454]: [WARNING]  (228458) : All workers exited. Exiting... (0)
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.305 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 systemd[1]: libpod-c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd.scope: Deactivated successfully.
Jan 22 17:37:35 np0005592767 podman[228625]: 2026-01-22 22:37:35.313690249 +0000 UTC m=+0.057371410 container died c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:37:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd-userdata-shm.mount: Deactivated successfully.
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.342 182627 INFO nova.virt.libvirt.driver [-] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Instance destroyed successfully.#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.343 182627 DEBUG nova.objects.instance [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid ade065ae-eabe-4971-b544-a7ce3257b082 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay-46f39d06a443d59d3b2480d36a602fca55c825c3882698c684f009a22af96aa2-merged.mount: Deactivated successfully.
Jan 22 17:37:35 np0005592767 podman[228625]: 2026-01-22 22:37:35.353335388 +0000 UTC m=+0.097016469 container cleanup c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.359 182627 DEBUG nova.virt.libvirt.vif [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:36:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1408156996',display_name='tempest-TestNetworkAdvancedServerOps-server-1408156996',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1408156996',id=118,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNoIJTepGdykgsu+Hc4G5LBrSdR7LL3T6ljbXPiudwogM2bHm7/qR6mXmRdgKMdLRQDUJKUhwlku+ugLG11cFT+AoRlNK6bfxjnoLa5FeCxy/FowMUH0kyjxh2b0vfl7Jg==',key_name='tempest-TestNetworkAdvancedServerOps-1971532883',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:36:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-exm9vyqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:37:30Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=ade065ae-eabe-4971-b544-a7ce3257b082,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.359 182627 DEBUG nova.network.os_vif_util [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.360 182627 DEBUG nova.network.os_vif_util [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:78:d2,bridge_name='br-int',has_traffic_filtering=True,id=8a753cac-4253-415c-af0a-07dba97847f4,network=Network(493d7061-afab-4afb-8a74-384eb3cdb0f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a753cac-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.360 182627 DEBUG os_vif [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:78:d2,bridge_name='br-int',has_traffic_filtering=True,id=8a753cac-4253-415c-af0a-07dba97847f4,network=Network(493d7061-afab-4afb-8a74-384eb3cdb0f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a753cac-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.362 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.363 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a753cac-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.365 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.368 182627 INFO os_vif [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:78:d2,bridge_name='br-int',has_traffic_filtering=True,id=8a753cac-4253-415c-af0a-07dba97847f4,network=Network(493d7061-afab-4afb-8a74-384eb3cdb0f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a753cac-42')#033[00m
Jan 22 17:37:35 np0005592767 systemd[1]: libpod-conmon-c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd.scope: Deactivated successfully.
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.369 182627 INFO nova.virt.libvirt.driver [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Deleting instance files /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082_del#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.370 182627 INFO nova.virt.libvirt.driver [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Deletion of /var/lib/nova/instances/ade065ae-eabe-4971-b544-a7ce3257b082_del complete#033[00m
Jan 22 17:37:35 np0005592767 podman[228670]: 2026-01-22 22:37:35.419864652 +0000 UTC m=+0.041326786 container remove c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.424 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e9eead66-0525-40f9-a4be-94036707f4f4]: (4, ('Thu Jan 22 10:37:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8 (c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd)\nc657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd\nThu Jan 22 10:37:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8 (c657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd)\nc657b9f1b919f87aeca939b50a9ee6c006028ebecd12f9d07743cee10116eabd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.426 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e07f15-57ad-4266-80b3-35b86f0167e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.427 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap493d7061-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.428 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 kernel: tap493d7061-a0: left promiscuous mode
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.441 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.445 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[266b0434-efa9-4ecc-a7a4-d347d94100ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.457 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[762b0f08-9222-4827-9451-24475588ac86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.459 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c5806cc7-7862-4010-aca8-675e7ed7c88a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.461 182627 INFO nova.compute.manager [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.461 182627 DEBUG oslo.service.loopingcall [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.461 182627 DEBUG nova.compute.manager [-] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.462 182627 DEBUG nova.network.neutron [-] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.488 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[885c3ae3-61ab-426b-8515-6494a2cf13a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504162, 'reachable_time': 16060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228685, 'error': None, 'target': 'ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.492 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-493d7061-afab-4afb-8a74-384eb3cdb0f8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:37:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:37:35.492 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4654fa-866a-4ff8-b309-b996792f4e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.493 182627 DEBUG nova.compute.manager [req-81df39be-9aeb-44aa-80ed-79e3a0baffe3 req-874b3365-9048-405a-9aec-aa12d630c846 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Received event network-vif-unplugged-8a753cac-4253-415c-af0a-07dba97847f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.494 182627 DEBUG oslo_concurrency.lockutils [req-81df39be-9aeb-44aa-80ed-79e3a0baffe3 req-874b3365-9048-405a-9aec-aa12d630c846 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:35 np0005592767 systemd[1]: run-netns-ovnmeta\x2d493d7061\x2dafab\x2d4afb\x2d8a74\x2d384eb3cdb0f8.mount: Deactivated successfully.
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.494 182627 DEBUG oslo_concurrency.lockutils [req-81df39be-9aeb-44aa-80ed-79e3a0baffe3 req-874b3365-9048-405a-9aec-aa12d630c846 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.494 182627 DEBUG oslo_concurrency.lockutils [req-81df39be-9aeb-44aa-80ed-79e3a0baffe3 req-874b3365-9048-405a-9aec-aa12d630c846 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.494 182627 DEBUG nova.compute.manager [req-81df39be-9aeb-44aa-80ed-79e3a0baffe3 req-874b3365-9048-405a-9aec-aa12d630c846 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] No waiting events found dispatching network-vif-unplugged-8a753cac-4253-415c-af0a-07dba97847f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.495 182627 DEBUG nova.compute.manager [req-81df39be-9aeb-44aa-80ed-79e3a0baffe3 req-874b3365-9048-405a-9aec-aa12d630c846 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Received event network-vif-unplugged-8a753cac-4253-415c-af0a-07dba97847f4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.762 182627 DEBUG nova.compute.manager [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.847 182627 INFO nova.compute.manager [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] instance snapshotting#033[00m
Jan 22 17:37:35 np0005592767 nova_compute[182623]: 2026-01-22 22:37:35.848 182627 DEBUG nova.objects.instance [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'flavor' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.112 182627 INFO nova.virt.libvirt.driver [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Beginning live snapshot process#033[00m
Jan 22 17:37:36 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.325 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.414 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.415 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.476 182627 DEBUG nova.network.neutron [-] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.478 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.491 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.511 182627 INFO nova.compute.manager [-] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Took 1.05 seconds to deallocate network for instance.#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.562 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.563 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpco8afc4n/c856c2df339a47388fd41c2c488f5a40.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.589 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.589 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.597 182627 DEBUG nova.compute.manager [req-47606641-5b67-4660-90ac-dd183e17614e req-9ef6ce78-35f4-469e-b2b6-a31e7a470021 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Received event network-vif-deleted-8a753cac-4253-415c-af0a-07dba97847f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.599 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.605 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpco8afc4n/c856c2df339a47388fd41c2c488f5a40.delta 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.606 182627 INFO nova.virt.libvirt.driver [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:37:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:36Z|00478|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.640 182627 INFO nova.scheduler.client.report [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocations for instance ade065ae-eabe-4971-b544-a7ce3257b082#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.651 182627 DEBUG nova.virt.libvirt.guest [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.685 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:36 np0005592767 nova_compute[182623]: 2026-01-22 22:37:36.855 182627 DEBUG oslo_concurrency.lockutils [None req-1d2b585f-de3a-4e45-82c5-86750f62df81 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.047 182627 DEBUG nova.network.neutron [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Updated VIF entry in instance network info cache for port 8a753cac-4253-415c-af0a-07dba97847f4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.047 182627 DEBUG nova.network.neutron [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Updating instance_info_cache with network_info: [{"id": "8a753cac-4253-415c-af0a-07dba97847f4", "address": "fa:16:3e:f3:78:d2", "network": {"id": "493d7061-afab-4afb-8a74-384eb3cdb0f8", "bridge": "br-int", "label": "tempest-network-smoke--1073254463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a753cac-42", "ovs_interfaceid": "8a753cac-4253-415c-af0a-07dba97847f4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.069 182627 DEBUG oslo_concurrency.lockutils [req-74705996-b4c4-423a-9f63-92d6141a3636 req-2a663e40-6506-4ed1-9f4d-69e043572678 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-ade065ae-eabe-4971-b544-a7ce3257b082" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.155 182627 DEBUG nova.virt.libvirt.guest [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.157 182627 INFO nova.virt.libvirt.driver [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.204 182627 DEBUG nova.privsep.utils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.205 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpco8afc4n/c856c2df339a47388fd41c2c488f5a40.delta /var/lib/nova/instances/snapshots/tmpco8afc4n/c856c2df339a47388fd41c2c488f5a40 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.612 182627 DEBUG nova.compute.manager [req-d83fea02-3bc4-45e8-8a67-f16b209877cc req-adba183b-2e67-44d0-8c42-0d58436536ba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Received event network-vif-plugged-8a753cac-4253-415c-af0a-07dba97847f4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.613 182627 DEBUG oslo_concurrency.lockutils [req-d83fea02-3bc4-45e8-8a67-f16b209877cc req-adba183b-2e67-44d0-8c42-0d58436536ba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.613 182627 DEBUG oslo_concurrency.lockutils [req-d83fea02-3bc4-45e8-8a67-f16b209877cc req-adba183b-2e67-44d0-8c42-0d58436536ba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.614 182627 DEBUG oslo_concurrency.lockutils [req-d83fea02-3bc4-45e8-8a67-f16b209877cc req-adba183b-2e67-44d0-8c42-0d58436536ba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ade065ae-eabe-4971-b544-a7ce3257b082-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.614 182627 DEBUG nova.compute.manager [req-d83fea02-3bc4-45e8-8a67-f16b209877cc req-adba183b-2e67-44d0-8c42-0d58436536ba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] No waiting events found dispatching network-vif-plugged-8a753cac-4253-415c-af0a-07dba97847f4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.615 182627 WARNING nova.compute.manager [req-d83fea02-3bc4-45e8-8a67-f16b209877cc req-adba183b-2e67-44d0-8c42-0d58436536ba 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Received unexpected event network-vif-plugged-8a753cac-4253-415c-af0a-07dba97847f4 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.626 182627 DEBUG oslo_concurrency.processutils [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpco8afc4n/c856c2df339a47388fd41c2c488f5a40.delta /var/lib/nova/instances/snapshots/tmpco8afc4n/c856c2df339a47388fd41c2c488f5a40" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:37 np0005592767 nova_compute[182623]: 2026-01-22 22:37:37.637 182627 INFO nova.virt.libvirt.driver [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:37:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:39Z|00479|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:37:39 np0005592767 nova_compute[182623]: 2026-01-22 22:37:39.609 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:40 np0005592767 nova_compute[182623]: 2026-01-22 22:37:40.365 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:40 np0005592767 nova_compute[182623]: 2026-01-22 22:37:40.924 182627 INFO nova.virt.libvirt.driver [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Snapshot image upload complete#033[00m
Jan 22 17:37:40 np0005592767 nova_compute[182623]: 2026-01-22 22:37:40.925 182627 INFO nova.compute.manager [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Took 5.04 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:37:41 np0005592767 podman[228716]: 2026-01-22 22:37:41.140428441 +0000 UTC m=+0.060032135 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:37:41 np0005592767 nova_compute[182623]: 2026-01-22 22:37:41.450 182627 DEBUG nova.compute.manager [None req-7ebcd55b-7748-438b-9c25-5fe14fd1a448 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Jan 22 17:37:41 np0005592767 nova_compute[182623]: 2026-01-22 22:37:41.686 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:42 np0005592767 nova_compute[182623]: 2026-01-22 22:37:42.297 182627 DEBUG nova.compute.manager [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:42 np0005592767 nova_compute[182623]: 2026-01-22 22:37:42.393 182627 INFO nova.compute.manager [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] instance snapshotting#033[00m
Jan 22 17:37:42 np0005592767 nova_compute[182623]: 2026-01-22 22:37:42.394 182627 DEBUG nova.objects.instance [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'flavor' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:37:42 np0005592767 nova_compute[182623]: 2026-01-22 22:37:42.695 182627 INFO nova.virt.libvirt.driver [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Beginning live snapshot process#033[00m
Jan 22 17:37:42 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:37:42 np0005592767 nova_compute[182623]: 2026-01-22 22:37:42.967 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.020 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.021 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.075 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.089 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.143 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.144 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxxza5z2l/985c471853674021a9033848e6043a20.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.179 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxxza5z2l/985c471853674021a9033848e6043a20.delta 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.181 182627 INFO nova.virt.libvirt.driver [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.224 182627 DEBUG nova.virt.libvirt.guest [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.729 182627 DEBUG nova.virt.libvirt.guest [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.732 182627 INFO nova.virt.libvirt.driver [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.767 182627 DEBUG nova.privsep.utils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:37:43 np0005592767 nova_compute[182623]: 2026-01-22 22:37:43.768 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxxza5z2l/985c471853674021a9033848e6043a20.delta /var/lib/nova/instances/snapshots/tmpxxza5z2l/985c471853674021a9033848e6043a20 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:37:44 np0005592767 nova_compute[182623]: 2026-01-22 22:37:44.188 182627 DEBUG oslo_concurrency.processutils [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxxza5z2l/985c471853674021a9033848e6043a20.delta /var/lib/nova/instances/snapshots/tmpxxza5z2l/985c471853674021a9033848e6043a20" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:37:44 np0005592767 nova_compute[182623]: 2026-01-22 22:37:44.199 182627 INFO nova.virt.libvirt.driver [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:37:45 np0005592767 nova_compute[182623]: 2026-01-22 22:37:45.369 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:45 np0005592767 nova_compute[182623]: 2026-01-22 22:37:45.964 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:46 np0005592767 nova_compute[182623]: 2026-01-22 22:37:46.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:46 np0005592767 nova_compute[182623]: 2026-01-22 22:37:46.688 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:46 np0005592767 nova_compute[182623]: 2026-01-22 22:37:46.792 182627 INFO nova.virt.libvirt.driver [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Snapshot image upload complete#033[00m
Jan 22 17:37:46 np0005592767 nova_compute[182623]: 2026-01-22 22:37:46.794 182627 INFO nova.compute.manager [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Took 4.37 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:37:47 np0005592767 nova_compute[182623]: 2026-01-22 22:37:47.115 182627 DEBUG nova.compute.manager [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Jan 22 17:37:47 np0005592767 nova_compute[182623]: 2026-01-22 22:37:47.116 182627 DEBUG nova.compute.manager [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458#033[00m
Jan 22 17:37:47 np0005592767 nova_compute[182623]: 2026-01-22 22:37:47.116 182627 DEBUG nova.compute.manager [None req-b4a6987b-dc06-493c-aad7-084818ede8c5 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Deleting image e336a000-27bf-4c8a-9cc2-2b30d3bbf694 _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463#033[00m
Jan 22 17:37:50 np0005592767 nova_compute[182623]: 2026-01-22 22:37:50.341 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121455.3403382, ade065ae-eabe-4971-b544-a7ce3257b082 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:37:50 np0005592767 nova_compute[182623]: 2026-01-22 22:37:50.342 182627 INFO nova.compute.manager [-] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:37:50 np0005592767 nova_compute[182623]: 2026-01-22 22:37:50.370 182627 DEBUG nova.compute.manager [None req-bc71b7a2-586f-4036-9f04-4f82797b69c9 - - - - - -] [instance: ade065ae-eabe-4971-b544-a7ce3257b082] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:37:50 np0005592767 nova_compute[182623]: 2026-01-22 22:37:50.372 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:50 np0005592767 nova_compute[182623]: 2026-01-22 22:37:50.396 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:51 np0005592767 nova_compute[182623]: 2026-01-22 22:37:51.024 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:51 np0005592767 nova_compute[182623]: 2026-01-22 22:37:51.690 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:52 np0005592767 podman[228771]: 2026-01-22 22:37:52.165728338 +0000 UTC m=+0.072575493 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 22 17:37:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:37:55Z|00480|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:37:55 np0005592767 nova_compute[182623]: 2026-01-22 22:37:55.109 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:55 np0005592767 nova_compute[182623]: 2026-01-22 22:37:55.374 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:56 np0005592767 nova_compute[182623]: 2026-01-22 22:37:56.693 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:37:58 np0005592767 podman[228792]: 2026-01-22 22:37:58.1486793 +0000 UTC m=+0.057223807 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Jan 22 17:37:58 np0005592767 podman[228791]: 2026-01-22 22:37:58.195568769 +0000 UTC m=+0.107258703 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:38:00 np0005592767 nova_compute[182623]: 2026-01-22 22:38:00.377 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.124 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "316c1ab4-daa9-4a02-949f-84f24baeff9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.124 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "316c1ab4-daa9-4a02-949f-84f24baeff9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.139 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.222 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.223 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.228 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.228 182627 INFO nova.compute.claims [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.358 182627 DEBUG nova.compute.provider_tree [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.371 182627 DEBUG nova.scheduler.client.report [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.393 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.393 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.452 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.471 182627 INFO nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.488 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.585 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.586 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.587 182627 INFO nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Creating image(s)#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.587 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.588 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.588 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.599 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.654 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.655 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.655 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.666 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.694 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.720 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.721 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.759 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.760 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.760 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.828 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.829 182627 DEBUG nova.virt.disk.api [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Checking if we can resize image /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.830 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.888 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.889 182627 DEBUG nova.virt.disk.api [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Cannot resize image /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.890 182627 DEBUG nova.objects.instance [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'migration_context' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.911 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.911 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Ensure instance console log exists: /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.912 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.912 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.912 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.913 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.917 182627 WARNING nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.920 182627 DEBUG nova.virt.libvirt.host [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.921 182627 DEBUG nova.virt.libvirt.host [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.923 182627 DEBUG nova.virt.libvirt.host [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.923 182627 DEBUG nova.virt.libvirt.host [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.924 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.925 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.925 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.925 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.926 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.926 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.926 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.926 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.926 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.927 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.927 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.927 182627 DEBUG nova.virt.hardware [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.930 182627 DEBUG nova.objects.instance [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'pci_devices' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.942 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <uuid>316c1ab4-daa9-4a02-949f-84f24baeff9e</uuid>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <name>instance-0000007d</name>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerShowV254Test-server-1301160352</nova:name>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:38:01</nova:creationTime>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:user uuid="c2adfcb38a48412f923cf60e59b6b2e0">tempest-ServerShowV254Test-8531099-project-member</nova:user>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:        <nova:project uuid="f0db1b4cb89c4543aa892221c8094022">tempest-ServerShowV254Test-8531099</nova:project>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <entry name="serial">316c1ab4-daa9-4a02-949f-84f24baeff9e</entry>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <entry name="uuid">316c1ab4-daa9-4a02-949f-84f24baeff9e</entry>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/console.log" append="off"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:38:01 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:38:01 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:38:01 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:38:01 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.977 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.977 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:38:01 np0005592767 nova_compute[182623]: 2026-01-22 22:38:01.977 182627 INFO nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Using config drive#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.145 182627 INFO nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Creating config drive at /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.149 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpehspgt87 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.291 182627 DEBUG oslo_concurrency.processutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpehspgt87" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:02 np0005592767 systemd-machined[153912]: New machine qemu-62-instance-0000007d.
Jan 22 17:38:02 np0005592767 systemd[1]: Started Virtual Machine qemu-62-instance-0000007d.
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.884 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121482.883379, 316c1ab4-daa9-4a02-949f-84f24baeff9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.884 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.887 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.887 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.891 182627 INFO nova.virt.libvirt.driver [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance spawned successfully.#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.891 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.915 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.918 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.919 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.919 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.920 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.920 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.920 182627 DEBUG nova.virt.libvirt.driver [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.926 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.983 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.984 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121482.8857405, 316c1ab4-daa9-4a02-949f-84f24baeff9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:38:02 np0005592767 nova_compute[182623]: 2026-01-22 22:38:02.984 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] VM Started (Lifecycle Event)#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.004 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.009 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.014 182627 INFO nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Took 1.43 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.014 182627 DEBUG nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.025 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.084 182627 INFO nova.compute.manager [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Took 1.89 seconds to build instance.#033[00m
Jan 22 17:38:03 np0005592767 nova_compute[182623]: 2026-01-22 22:38:03.102 182627 DEBUG oslo_concurrency.lockutils [None req-68dd3789-ca96-4701-aab1-2c0a4ccd35f0 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "316c1ab4-daa9-4a02-949f-84f24baeff9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:04 np0005592767 podman[228881]: 2026-01-22 22:38:04.134333947 +0000 UTC m=+0.054408499 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:38:04 np0005592767 podman[228882]: 2026-01-22 22:38:04.134720798 +0000 UTC m=+0.056430045 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:38:04 np0005592767 nova_compute[182623]: 2026-01-22 22:38:04.795 182627 INFO nova.compute.manager [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Rebuilding instance#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.067 182627 DEBUG nova.compute.manager [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.144 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'pci_requests' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.155 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'pci_devices' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.167 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'resources' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.175 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'migration_context' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.185 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.188 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:38:05 np0005592767 nova_compute[182623]: 2026-01-22 22:38:05.381 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:06 np0005592767 nova_compute[182623]: 2026-01-22 22:38:06.698 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.325 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '17a24497-f021-486d-8f08-892c79ea1d31', 'name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000077', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'abdd987d004046138277253df8658aca', 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'hostId': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.329 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'name': 'tempest-ServerShowV254Test-server-1301160352', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f0db1b4cb89c4543aa892221c8094022', 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'hostId': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.329 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.330 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>]
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.343 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.344 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.359 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.359 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '606fdd57-6b15-4467-aa75-318254ce797d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.330632', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05ae7b62-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.965510896, 'message_signature': 'cb663400ff0749497025e5b31d162c78f183fbd8f421ab9b43e0f6f55d89e49d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 
'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.330632', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05ae8c56-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.965510896, 'message_signature': '75487dca7e162e78be687a7aa566ec0650e3e845ab27649dff3544949d2f05ae'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.330632', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05b0c606-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.97972601, 'message_signature': '27bc8f5aa91df335251c80431d1afb2ed388ab42e88fc1ecd0015d721f580eaf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.330632', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05b0d164-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.97972601, 'message_signature': '0b06dc8b9b55c0fd158367077e5628b7564d6e3f57dc579724a627df797912ee'}]}, 'timestamp': '2026-01-22 22:38:07.359793', '_unique_id': '3ba11f4bb32d4c81ad7ceb3541590b2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.362 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.363 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.366 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 17a24497-f021-486d-8f08-892c79ea1d31 / tap570f7f3c-2f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.366 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57e3b691-1b81-4ae1-a92b-51c2fc3ced76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.363586', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05b1f328-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '4469602c2d1e04eb3d83b5c973145ef38b5c5b995a4ac67f46ca4fd1c8c14f8e'}]}, 'timestamp': '2026-01-22 22:38:07.368664', '_unique_id': 'ac543a44bac94cf29953eddc60fd7322'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.370 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b15e4b2-5354-4916-8497-87ec00026038', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.370452', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05b28086-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '2bb3153a57edca2f9a666d3502df766f3f2e7ff602220cbc18b1257f60a4dd8e'}]}, 'timestamp': '2026-01-22 22:38:07.370811', '_unique_id': '96547f6323e345b7a27badabc5060924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.372 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.398 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.write.bytes volume: 73089024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.399 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.437 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.438 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e7f745b-0cc9-468a-a3c0-a7142ab9ca44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73089024, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.372691', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05b6df32-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '72acb0d8970c82db1b464fcc34ceb41b4add1b6f7a04cc83d0babef748a063f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.372691', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05b6e950-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': 'a049f09444bbf88a6b6fc9920fc41f7ba92faef155164fa4baad58a0de8732c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.372691', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05bcc8de-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '2d33473ff734bb92e649736667ce1e49c6ce6d7d4881e596a8f65e279d3088f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.372691', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05bcd586-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '81c9c19e7c1a63b8f21576b8ecf49a60660c3382e3e4b9a14f852e0b32747c68'}]}, 'timestamp': '2026-01-22 22:38:07.438495', '_unique_id': '7917b71b625643b5a413a5a8ac9bdb07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.440 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.440 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.440 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.440 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4f65527-17ae-410a-812d-bf0e60dcebc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.440218', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05bd22ac-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.965510896, 'message_signature': 'c591c45ab970fc22a6b49318362a94d178be0ee12f5d4620b2170c9470964843'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.440218', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05bd2acc-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.965510896, 'message_signature': '96ce372b26e40d74fb9ff2c26009be675e6cdecc6bb108252c5a0df0735787fa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.440218', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05bd356c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.97972601, 'message_signature': 'a3e81cb29663870a3ac55d37d0dafc395b4c3a717a92b5ed7a8a039a638d3fe8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.440218', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05bd3ea4-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.97972601, 'message_signature': '0ceb7174bcffcc60f3035feb57420ebe891ffac034ab6dcc2bfe637449071615'}]}, 'timestamp': '2026-01-22 22:38:07.441164', '_unique_id': 'c7ff5f0416964e1da79403bc22b1f955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.441 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.442 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.443 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9551c2ca-68d0-4249-98fc-8af07b9ae8b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.443305', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05bd9b56-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '7a7f9e0c66eb29b4760a70b1e1498c7f1b7482312d0d6535fe44f0d3bfd87d2a'}]}, 'timestamp': '2026-01-22 22:38:07.443555', '_unique_id': 'ad79a821d38a428bab11be7f75bcf6fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.read.bytes volume: 30755328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.444 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.445 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.445 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4698295-be04-4542-bc26-da60318e1865', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30755328, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.444696', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05bdd10c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '874440603b503a2ff33a5e0daa242c57cea9e6a4b161125a0c4592ee344edb36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.444696', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05bdd92c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '3b75fd1cb7cd85e42db1a72cfeb41a0cc75449b0db1d698954cceecf57e817fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.444696', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05bde16a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '3792a5cd9c41d008989a7b4d80da36213e9c5e23efaae6add487ba76322cea58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.444696', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05bde930-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '89678943b3983f819bd4ec56f15d01b651425ac07c28803247fc0e40f0af72da'}]}, 'timestamp': '2026-01-22 22:38:07.445559', '_unique_id': '1933d27e789c4188a4d47db41b4b72e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.447 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.447 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.448 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.448 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84c1aea9-8f89-404f-9217-339eefd9380a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.447546', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05be4268-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '9bda799c42966713d139c7fa64d48d94941740098584cea88d62322547aabc4c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.447546', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05be4d8a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '56b9073323a7a73804783a3db531cac3bdbec75098e8aa90b1a7874516dcd71b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.447546', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05be57b2-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '0c747c3d7549078de8d6c59ead055e84879b9182c8142f461cf2d7208bdf8434'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.447546', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05be628e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': 'f6a2bca9514021a7e7f742079d066637675c78fe027fa1323d6c357ef5eb805e'}]}, 'timestamp': '2026-01-22 22:38:07.448640', '_unique_id': '2e052e0c908245fba786be92c43bcaac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.450 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cd477fe-67ae-4e90-a55a-22fad961680a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.450266', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05beac8a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': 'f802ec3a4e058732c2edac4659d5c888762d860d1ac065a226ec9963f2868381'}]}, 'timestamp': '2026-01-22 22:38:07.450547', '_unique_id': 'f7e96f0216f643dbb60dce3d31a4c417'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.451 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.465 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/cpu volume: 11710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.489 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/cpu volume: 4410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fa5f1cc-42d0-4c26-97aa-3e111fb6ab5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11710000000, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'timestamp': '2026-01-22T22:38:07.452062', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '05c10d4a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.100528457, 'message_signature': '8aa7cdacdbcba4ee3ca2f568294faeb5713f2882500bfe0e3ac950326f2c6056'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4410000000, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'timestamp': '2026-01-22T22:38:07.452062', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '05c4a996-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.124133361, 'message_signature': 'e110ec249be2e8831b0a8806597825fcc6ba1b8cb8dc43a5850df8c1b0c9af5d'}]}, 'timestamp': '2026-01-22 22:38:07.489932', '_unique_id': '9dd327f0ba0e4016b01236b20a4cb7d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.491 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.492 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.492 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23d0f509-4c70-4349-b2cc-10df9f74beae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.492133', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05c512d2-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '15e30c0bf80d5394dbdeca149fb2bfbd565080764d99c4c381b2e88415e29e68'}]}, 'timestamp': '2026-01-22 22:38:07.492605', '_unique_id': '33bffba50abc4ac6a6b0b603a27b24a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.493 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.495 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.read.requests volume: 1110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.495 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.496 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.496 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78c03800-804a-40ed-a8f8-0e42978a4ba2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1110, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.495419', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c592ca-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': 'ad156934045b7cbde7569bd33a7655b0ab7216b821ef9f30e99f558925d2dc52'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.495419', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c5a148-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': 'e656ce277d471003d5e5e31738ec54ec0052b1a3c08c43f4e5a8b147311082e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.495419', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c5afee-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '354db31f23a69e40cc14ce0d74a6bf9ffb8d9682efda0160c1c67eee03adaece'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.495419', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c5bd18-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '88af5cfdd3112c3d1f10af12ca317e8bb992782be1b6834698a5283be45f46d7'}]}, 'timestamp': '2026-01-22 22:38:07.496896', '_unique_id': '10dca715583e4b728191b64f85aa504e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.499 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.499 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b597c466-69ed-441e-a3b1-d144c452178e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.499334', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05c62bd6-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '76691e6314996af5a3668faf7fcb92697337b33850d6a826bacefe4171274d44'}]}, 'timestamp': '2026-01-22 22:38:07.499733', '_unique_id': '43817be186724a71bdf8341a1c03edd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.500 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.501 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.502 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/memory.usage volume: 42.4765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.502 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.502 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 316c1ab4-daa9-4a02-949f-84f24baeff9e: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1e1c9b6-b5a2-4f00-9149-590cceb7facc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4765625, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'timestamp': '2026-01-22T22:38:07.502003', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '05c6959e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.100528457, 'message_signature': '54ab77e96cd25d310419ffcc2dd44f54bdb993c4e42ae2ad5b1fcee3ebee2014'}]}, 'timestamp': '2026-01-22 22:38:07.502822', '_unique_id': '95c0b6c114ea4f119252300fe75fcfc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.503 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.504 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cdb1b9f-de9d-4c3e-b5dd-028a01ead594', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.504881', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05c70434-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '5f0d419af13faa5664335d923e034b8cf8500346f736760b34771c5d487734d4'}]}, 'timestamp': '2026-01-22 22:38:07.505292', '_unique_id': '94269742e7444ae3abc895a3f7182996'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.506 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.507 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.507 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.508 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.508 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b705754a-f197-413b-a713-113e89c4ebdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.507533', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c76b90-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.965510896, 'message_signature': 'e3c86bf7fcebd3119f1e2ea8a54947137673049906e2911d7e1283cca5054b9a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.507533', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c778e2-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.965510896, 'message_signature': '8a7e2fd5dd00b25739bf50f4a8cba2f50b95caa97f02c87acc2df1f9945a285d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.507533', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c78706-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.97972601, 'message_signature': 'cb98d7b1c4961687cf0950196d0a03c741defe31146f2155998f907122f951ae'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.507533', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c793e0-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.97972601, 'message_signature': '5f4aff5f443e075faa8cc4a31bc2510b0423d7613282223195e07f4ef1dbf31f'}]}, 'timestamp': '2026-01-22 22:38:07.508925', '_unique_id': '14091428256944adb60c390908d438c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.509 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.511 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '871839eb-2e83-4bae-98b6-53d51e1334b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.511090', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05c7f79a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '006de02938330dfb60f33a4923bc81a9426585a49f04eca35fffbde279854a35'}]}, 'timestamp': '2026-01-22 22:38:07.511500', '_unique_id': '72f123a6b9fc49498da1ba153cd1518d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.513 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.513 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>]
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.514 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.514 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d9f9c51-6bb0-49dd-81c2-08ba83ac41a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.514140', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05c86f36-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '25d3f90bce25e0b41d4b3b87e4b79ecf05c4be7ba6dbb89754b86ccebc1c865c'}]}, 'timestamp': '2026-01-22 22:38:07.514564', '_unique_id': 'd0d76cc0b30b441ea4d6f5f03f715876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.515 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.516 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.write.latency volume: 4598693677 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.517 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.517 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.517 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48974fb3-1aa4-48f9-ac90-1c9e3271c756', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4598693677, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.516664', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c8d034-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': 'e51d342e8df69b5858093f11d6e0f31c142d77ffcd8c46b6b9b98a79471b1187'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.516664', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c8de76-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '7428322f0b81b14c746602c7c17bd47619045a438e1fbf2ceb641e6a303f272f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.516664', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c8ebc8-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '06fb983c70c535693254f0cf0f43f9ca01cb373e77ed8c29ec1825fbd0d81e43'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.516664', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c8f88e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': '06a1a88e0322abb005a7f5b7c11bc79dd0603af77bfa690d6979bea2c722e586'}]}, 'timestamp': '2026-01-22 22:38:07.518056', '_unique_id': '3087b2dc526940d79b17510a7ae35b53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.518 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.520 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.520 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>]
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.521 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.read.latency volume: 184828760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.521 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/disk.device.read.latency volume: 31873606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.521 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.read.latency volume: 113257648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.522 12 DEBUG ceilometer.compute.pollsters [-] 316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.device.read.latency volume: 786802 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90292e53-7d56-4e6c-9f3d-b014211175c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 184828760, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-vda', 'timestamp': '2026-01-22T22:38:07.521177', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c98164-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': 'd8052aba069b850b212fb7476df03a3aa4dd9ec85baf856613468c3f21d976e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31873606, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': '17a24497-f021-486d-8f08-892c79ea1d31-sda', 'timestamp': '2026-01-22T22:38:07.521177', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'instance-00000077', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c98eca-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.007555621, 'message_signature': '653f291a9daedc2dfaad88c3fcc36fa30163ff2d76b434b494c704ca3f0170c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 113257648, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-vda', 'timestamp': '2026-01-22T22:38:07.521177', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '05c99bd6-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': 'b2c50657da8bcb8eaaf4c863725e2f7cfd216a8e0f678c1d21fea62ba2b49bda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 786802, 'user_id': 'c2adfcb38a48412f923cf60e59b6b2e0', 'user_name': None, 'project_id': 'f0db1b4cb89c4543aa892221c8094022', 'project_name': None, 'resource_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e-sda', 'timestamp': '2026-01-22T22:38:07.521177', 'resource_metadata': {'display_name': 'tempest-ServerShowV254Test-server-1301160352', 'name': 'instance-0000007d', 'instance_id': '316c1ab4-daa9-4a02-949f-84f24baeff9e', 'instance_type': 'm1.nano', 'host': '4c29f01a738dff1301431deac35205c6f91481b34fad2118b96964eb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '05c9adc4-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5084.034510928, 'message_signature': 'b333e91279c6c70e377aa31806400e80a5e2c8b525e0d5bb37bb0a8afbf0695c'}]}, 'timestamp': '2026-01-22 22:38:07.522744', '_unique_id': 'd12bac20effd46a4be4c8de008dccd21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.523 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.525 12 DEBUG ceilometer.compute.pollsters [-] 17a24497-f021-486d-8f08-892c79ea1d31/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc59d4d-09fd-4606-8b30-9c787a03559a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_name': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_name': None, 'resource_id': 'instance-00000077-17a24497-f021-486d-8f08-892c79ea1d31-tap570f7f3c-2f', 'timestamp': '2026-01-22T22:38:07.525441', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1990766256', 'name': 'tap570f7f3c-2f', 'instance_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'instance_type': 'm1.nano', 'host': '5c69c04e757a6908e5809b106de973abfbe4e0eac6a83f5fd79bc165', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:50:26:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap570f7f3c-2f'}, 'message_id': '05ca29f2-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5083.998459099, 'message_signature': '6d78717197bcd36314c2ac405b420abab881fe92fe39c9d0fcd5cffe30b3ddbb'}]}, 'timestamp': '2026-01-22 22:38:07.525963', '_unique_id': '4beb17e561da4866bb6ae83d015593dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.526 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.528 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.528 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:38:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:38:07.528 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1990766256>, <NovaLikeServer: tempest-ServerShowV254Test-server-1301160352>]
Jan 22 17:38:08 np0005592767 nova_compute[182623]: 2026-01-22 22:38:08.322 182627 DEBUG nova.compute.manager [None req-55037731-ad11-467f-9ad2-22a394dfb44d 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Jan 22 17:38:10 np0005592767 nova_compute[182623]: 2026-01-22 22:38:10.385 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:11 np0005592767 nova_compute[182623]: 2026-01-22 22:38:11.702 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:11 np0005592767 nova_compute[182623]: 2026-01-22 22:38:11.998 182627 DEBUG oslo_concurrency.lockutils [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:12 np0005592767 nova_compute[182623]: 2026-01-22 22:38:11.999 182627 DEBUG oslo_concurrency.lockutils [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:12 np0005592767 nova_compute[182623]: 2026-01-22 22:38:12.000 182627 DEBUG nova.compute.manager [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:12 np0005592767 nova_compute[182623]: 2026-01-22 22:38:12.002 182627 DEBUG nova.compute.manager [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 22 17:38:12 np0005592767 nova_compute[182623]: 2026-01-22 22:38:12.003 182627 DEBUG nova.objects.instance [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'flavor' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:12 np0005592767 nova_compute[182623]: 2026-01-22 22:38:12.028 182627 DEBUG nova.objects.instance [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'info_cache' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:12.111 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:12.112 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:12.114 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:12 np0005592767 podman[228927]: 2026-01-22 22:38:12.174308484 +0000 UTC m=+0.082423475 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:38:12 np0005592767 nova_compute[182623]: 2026-01-22 22:38:12.293 182627 DEBUG nova.virt.libvirt.driver [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:38:14 np0005592767 kernel: tap570f7f3c-2f (unregistering): left promiscuous mode
Jan 22 17:38:14 np0005592767 NetworkManager[54973]: <info>  [1769121494.5632] device (tap570f7f3c-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:38:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:38:14Z|00481|binding|INFO|Releasing lport 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 from this chassis (sb_readonly=0)
Jan 22 17:38:14 np0005592767 nova_compute[182623]: 2026-01-22 22:38:14.573 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:38:14Z|00482|binding|INFO|Setting lport 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 down in Southbound
Jan 22 17:38:14 np0005592767 ovn_controller[94769]: 2026-01-22T22:38:14Z|00483|binding|INFO|Removing iface tap570f7f3c-2f ovn-installed in OVS
Jan 22 17:38:14 np0005592767 nova_compute[182623]: 2026-01-22 22:38:14.580 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.590 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:26:52 10.100.0.11'], port_security=['fa:16:3e:50:26:52 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17a24497-f021-486d-8f08-892c79ea1d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b9a45c4-3bd4-4f5f-b26b-5b1ab95bdd58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=570f7f3c-2f2a-4f46-87ea-784d78f09cc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.592 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 unbound from our chassis#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.594 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 84d8b010-d968-4df4-bedf-0c350ae42113, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.596 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[60f4ebe0-9803-4d43-9646-6d3b80b76cdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.597 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 namespace which is not needed anymore#033[00m
Jan 22 17:38:14 np0005592767 nova_compute[182623]: 2026-01-22 22:38:14.603 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:14 np0005592767 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 22 17:38:14 np0005592767 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000077.scope: Consumed 15.680s CPU time.
Jan 22 17:38:14 np0005592767 systemd-machined[153912]: Machine qemu-60-instance-00000077 terminated.
Jan 22 17:38:14 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [NOTICE]   (228223) : haproxy version is 2.8.14-c23fe91
Jan 22 17:38:14 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [NOTICE]   (228223) : path to executable is /usr/sbin/haproxy
Jan 22 17:38:14 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [WARNING]  (228223) : Exiting Master process...
Jan 22 17:38:14 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [WARNING]  (228223) : Exiting Master process...
Jan 22 17:38:14 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [ALERT]    (228223) : Current worker (228225) exited with code 143 (Terminated)
Jan 22 17:38:14 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[228219]: [WARNING]  (228223) : All workers exited. Exiting... (0)
Jan 22 17:38:14 np0005592767 systemd[1]: libpod-099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee.scope: Deactivated successfully.
Jan 22 17:38:14 np0005592767 podman[228995]: 2026-01-22 22:38:14.758483667 +0000 UTC m=+0.062674768 container died 099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 17:38:14 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee-userdata-shm.mount: Deactivated successfully.
Jan 22 17:38:14 np0005592767 systemd[1]: var-lib-containers-storage-overlay-c322baf418e8a199deb9b7abc4b5a1c54499a4c3b9d0419fbbacc2422e2fcbc8-merged.mount: Deactivated successfully.
Jan 22 17:38:14 np0005592767 nova_compute[182623]: 2026-01-22 22:38:14.811 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:14 np0005592767 podman[228995]: 2026-01-22 22:38:14.825548366 +0000 UTC m=+0.129739437 container cleanup 099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:38:14 np0005592767 systemd[1]: libpod-conmon-099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee.scope: Deactivated successfully.
Jan 22 17:38:14 np0005592767 podman[229039]: 2026-01-22 22:38:14.887022989 +0000 UTC m=+0.038240680 container remove 099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.895 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18a3e86e-d120-44aa-a024-df188c40d913]: (4, ('Thu Jan 22 10:38:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 (099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee)\n099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee\nThu Jan 22 10:38:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 (099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee)\n099222ac50db9cfb937f167ec72f38aba561a88e958caa20da4760f960a263ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.896 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4b172834-6897-45c1-869b-6bc9a4340442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.897 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:38:14 np0005592767 nova_compute[182623]: 2026-01-22 22:38:14.899 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:14 np0005592767 kernel: tap84d8b010-d0: left promiscuous mode
Jan 22 17:38:14 np0005592767 nova_compute[182623]: 2026-01-22 22:38:14.915 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.919 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc11330-f226-42bc-9c75-d72f246722c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.932 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee1521b-16f8-401b-9dc3-27a28e662fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.934 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2b7290-5b0a-4848-be92-43e875cfaf64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.950 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c6235496-d644-4268-b8fe-02f0d7fb5c59]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502468, 'reachable_time': 43636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229059, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.954 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:38:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:14.954 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a63be51e-3a27-4ce8-a95d-913eac772a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:38:14 np0005592767 systemd[1]: run-netns-ovnmeta\x2d84d8b010\x2dd968\x2d4df4\x2dbedf\x2d0c350ae42113.mount: Deactivated successfully.
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.042 182627 DEBUG nova.compute.manager [req-958b217a-baf1-43fc-a933-c538d1fdf238 req-e68d4751-3db4-42b2-a564-293fbf9fdd3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-vif-unplugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.043 182627 DEBUG oslo_concurrency.lockutils [req-958b217a-baf1-43fc-a933-c538d1fdf238 req-e68d4751-3db4-42b2-a564-293fbf9fdd3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.044 182627 DEBUG oslo_concurrency.lockutils [req-958b217a-baf1-43fc-a933-c538d1fdf238 req-e68d4751-3db4-42b2-a564-293fbf9fdd3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.044 182627 DEBUG oslo_concurrency.lockutils [req-958b217a-baf1-43fc-a933-c538d1fdf238 req-e68d4751-3db4-42b2-a564-293fbf9fdd3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.044 182627 DEBUG nova.compute.manager [req-958b217a-baf1-43fc-a933-c538d1fdf238 req-e68d4751-3db4-42b2-a564-293fbf9fdd3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] No waiting events found dispatching network-vif-unplugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.045 182627 WARNING nova.compute.manager [req-958b217a-baf1-43fc-a933-c538d1fdf238 req-e68d4751-3db4-42b2-a564-293fbf9fdd3c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received unexpected event network-vif-unplugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 for instance with vm_state active and task_state powering-off.#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.310 182627 INFO nova.virt.libvirt.driver [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance shutdown successfully after 3 seconds.#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.318 182627 INFO nova.virt.libvirt.driver [-] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance destroyed successfully.#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.318 182627 DEBUG nova.objects.instance [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'numa_topology' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.340 182627 DEBUG nova.compute.manager [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.354 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.387 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:15 np0005592767 nova_compute[182623]: 2026-01-22 22:38:15.444 182627 DEBUG oslo_concurrency.lockutils [None req-d955868a-f6cb-4ec8-b383-068f315eacbb 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:16 np0005592767 nova_compute[182623]: 2026-01-22 22:38:16.703 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:17 np0005592767 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 22 17:38:17 np0005592767 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007d.scope: Consumed 12.258s CPU time.
Jan 22 17:38:17 np0005592767 systemd-machined[153912]: Machine qemu-62-instance-0000007d terminated.
Jan 22 17:38:17 np0005592767 nova_compute[182623]: 2026-01-22 22:38:17.814 182627 DEBUG nova.compute.manager [req-2e5cc1b6-5a6e-4049-a919-6e300cb59b04 req-3992793b-5838-4130-a520-2f5886686fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:38:17 np0005592767 nova_compute[182623]: 2026-01-22 22:38:17.815 182627 DEBUG oslo_concurrency.lockutils [req-2e5cc1b6-5a6e-4049-a919-6e300cb59b04 req-3992793b-5838-4130-a520-2f5886686fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:17 np0005592767 nova_compute[182623]: 2026-01-22 22:38:17.815 182627 DEBUG oslo_concurrency.lockutils [req-2e5cc1b6-5a6e-4049-a919-6e300cb59b04 req-3992793b-5838-4130-a520-2f5886686fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:17 np0005592767 nova_compute[182623]: 2026-01-22 22:38:17.816 182627 DEBUG oslo_concurrency.lockutils [req-2e5cc1b6-5a6e-4049-a919-6e300cb59b04 req-3992793b-5838-4130-a520-2f5886686fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:17 np0005592767 nova_compute[182623]: 2026-01-22 22:38:17.816 182627 DEBUG nova.compute.manager [req-2e5cc1b6-5a6e-4049-a919-6e300cb59b04 req-3992793b-5838-4130-a520-2f5886686fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] No waiting events found dispatching network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:38:17 np0005592767 nova_compute[182623]: 2026-01-22 22:38:17.816 182627 WARNING nova.compute.manager [req-2e5cc1b6-5a6e-4049-a919-6e300cb59b04 req-3992793b-5838-4130-a520-2f5886686fc1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received unexpected event network-vif-plugged-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 for instance with vm_state stopped and task_state None.#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.368 182627 INFO nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.373 182627 INFO nova.virt.libvirt.driver [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance destroyed successfully.#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.378 182627 INFO nova.virt.libvirt.driver [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance destroyed successfully.#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.378 182627 INFO nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Deleting instance files /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e_del#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.379 182627 INFO nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Deletion of /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e_del complete#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.663 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.664 182627 INFO nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Creating image(s)#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.664 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.665 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.665 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.678 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.775 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.776 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.777 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.791 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.852 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.854 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.888 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.889 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.890 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.943 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.944 182627 DEBUG nova.virt.disk.api [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Checking if we can resize image /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:38:18 np0005592767 nova_compute[182623]: 2026-01-22 22:38:18.944 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.034 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.035 182627 DEBUG nova.virt.disk.api [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Cannot resize image /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.036 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.037 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Ensure instance console log exists: /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.038 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.039 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.039 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.042 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.049 182627 WARNING nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.060 182627 DEBUG nova.virt.libvirt.host [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.062 182627 DEBUG nova.virt.libvirt.host [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.066 182627 DEBUG nova.virt.libvirt.host [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.067 182627 DEBUG nova.virt.libvirt.host [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.069 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.070 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.071 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.071 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.072 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.073 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.073 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.074 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.074 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.075 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.075 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.076 182627 DEBUG nova.virt.hardware [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.077 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:38:19 np0005592767 nova_compute[182623]: 2026-01-22 22:38:19.104 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <uuid>316c1ab4-daa9-4a02-949f-84f24baeff9e</uuid>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <name>instance-0000007d</name>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerShowV254Test-server-1301160352</nova:name>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:38:19</nova:creationTime>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:user uuid="c2adfcb38a48412f923cf60e59b6b2e0">tempest-ServerShowV254Test-8531099-project-member</nova:user>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:        <nova:project uuid="f0db1b4cb89c4543aa892221c8094022">tempest-ServerShowV254Test-8531099</nova:project>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <nova:ports/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <entry name="serial">316c1ab4-daa9-4a02-949f-84f24baeff9e</entry>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <entry name="uuid">316c1ab4-daa9-4a02-949f-84f24baeff9e</entry>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/console.log" append="off"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:38:19 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:38:19 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:38:19 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:38:19 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 17:38:20 np0005592767 nova_compute[182623]: 2026-01-22 22:38:20.389 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:20 np0005592767 nova_compute[182623]: 2026-01-22 22:38:20.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:21 np0005592767 nova_compute[182623]: 2026-01-22 22:38:21.707 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:21 np0005592767 nova_compute[182623]: 2026-01-22 22:38:21.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.168 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.169 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.170 182627 INFO nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Using config drive
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.204 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.773 182627 INFO nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Creating config drive at /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.787 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3or_m29t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:38:22 np0005592767 nova_compute[182623]: 2026-01-22 22:38:22.922 182627 DEBUG oslo_concurrency.processutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3or_m29t" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:38:23 np0005592767 systemd-machined[153912]: New machine qemu-63-instance-0000007d.
Jan 22 17:38:23 np0005592767 systemd[1]: Started Virtual Machine qemu-63-instance-0000007d.
Jan 22 17:38:23 np0005592767 podman[229095]: 2026-01-22 22:38:23.120687964 +0000 UTC m=+0.097615716 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.502 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 316c1ab4-daa9-4a02-949f-84f24baeff9e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.503 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121503.5008085, 316c1ab4-daa9-4a02-949f-84f24baeff9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.504 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] VM Resumed (Lifecycle Event)
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.508 182627 DEBUG nova.compute.manager [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.508 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.513 182627 INFO nova.virt.libvirt.driver [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance spawned successfully.
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.513 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.534 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.542 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.543 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.545 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.546 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.546 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.547 182627 DEBUG nova.virt.libvirt.driver [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.555 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.582 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.583 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121503.5031545, 316c1ab4-daa9-4a02-949f-84f24baeff9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.583 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] VM Started (Lifecycle Event)
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.604 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.610 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.667 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.672 182627 DEBUG nova.compute.manager [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.812 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.813 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.814 182627 DEBUG nova.objects.instance [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.936 182627 DEBUG oslo_concurrency.lockutils [None req-c2ea0870-22d5-4961-a468-baa2c02b4915 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.957 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.958 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.959 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:38:23 np0005592767 nova_compute[182623]: 2026-01-22 22:38:23.959 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:38:25 np0005592767 nova_compute[182623]: 2026-01-22 22:38:25.391 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:26 np0005592767 nova_compute[182623]: 2026-01-22 22:38:26.708 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.169 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "316c1ab4-daa9-4a02-949f-84f24baeff9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.169 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "316c1ab4-daa9-4a02-949f-84f24baeff9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.170 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "316c1ab4-daa9-4a02-949f-84f24baeff9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.170 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "316c1ab4-daa9-4a02-949f-84f24baeff9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.170 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "316c1ab4-daa9-4a02-949f-84f24baeff9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.181 182627 INFO nova.compute.manager [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Terminating instance#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.191 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "refresh_cache-316c1ab4-daa9-4a02-949f-84f24baeff9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.192 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquired lock "refresh_cache-316c1ab4-daa9-4a02-949f-84f24baeff9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.192 182627 DEBUG nova.network.neutron [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.561 182627 DEBUG nova.network.neutron [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.629 182627 DEBUG oslo_concurrency.lockutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.964 182627 DEBUG nova.network.neutron [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.980 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Releasing lock "refresh_cache-316c1ab4-daa9-4a02-949f-84f24baeff9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:38:28 np0005592767 nova_compute[182623]: 2026-01-22 22:38:28.981 182627 DEBUG nova.compute.manager [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:38:29 np0005592767 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 22 17:38:29 np0005592767 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007d.scope: Consumed 6.035s CPU time.
Jan 22 17:38:29 np0005592767 systemd-machined[153912]: Machine qemu-63-instance-0000007d terminated.
Jan 22 17:38:29 np0005592767 podman[229134]: 2026-01-22 22:38:29.176966667 +0000 UTC m=+0.128890073 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, release=1755695350)
Jan 22 17:38:29 np0005592767 podman[229133]: 2026-01-22 22:38:29.180038132 +0000 UTC m=+0.135311091 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.236 182627 INFO nova.virt.libvirt.driver [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance destroyed successfully.#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.237 182627 DEBUG nova.objects.instance [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lazy-loading 'resources' on Instance uuid 316c1ab4-daa9-4a02-949f-84f24baeff9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.259 182627 INFO nova.virt.libvirt.driver [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Deleting instance files /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e_del#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.261 182627 INFO nova.virt.libvirt.driver [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Deletion of /var/lib/nova/instances/316c1ab4-daa9-4a02-949f-84f24baeff9e_del complete#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.339 182627 INFO nova.compute.manager [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.340 182627 DEBUG oslo.service.loopingcall [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.341 182627 DEBUG nova.compute.manager [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.341 182627 DEBUG nova.network.neutron [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.346 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.369 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.370 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.371 182627 DEBUG oslo_concurrency.lockutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.372 182627 DEBUG nova.network.neutron [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.374 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.376 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.377 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.377 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.377 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.407 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.408 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.409 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.409 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.490 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.529 182627 DEBUG nova.network.neutron [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.547 182627 DEBUG nova.network.neutron [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.565 182627 INFO nova.compute.manager [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Took 0.22 seconds to deallocate network for instance.#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.592 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.593 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.648 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.650 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.679 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.830 182627 DEBUG nova.compute.provider_tree [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.845 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121494.844032, 17a24497-f021-486d-8f08-892c79ea1d31 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.845 182627 INFO nova.compute.manager [-] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.860 182627 DEBUG nova.scheduler.client.report [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.873 182627 DEBUG nova.compute.manager [None req-5c5b3520-47ca-4a7a-acab-efd842bdf3eb - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.878 182627 DEBUG nova.compute.manager [None req-5c5b3520-47ca-4a7a-acab-efd842bdf3eb - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: resize_prep, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.883 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.904 182627 INFO nova.compute.manager [None req-5c5b3520-47ca-4a7a-acab-efd842bdf3eb - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] During sync_power_state the instance has a pending task (resize_prep). Skip.#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.922 182627 INFO nova.scheduler.client.report [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Deleted allocations for instance 316c1ab4-daa9-4a02-949f-84f24baeff9e#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.977 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.978 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5693MB free_disk=73.15967178344727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.978 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:29 np0005592767 nova_compute[182623]: 2026-01-22 22:38:29.979 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.012 182627 DEBUG oslo_concurrency.lockutils [None req-f278ee5b-c4bc-4652-8fb3-b574f13bfb39 c2adfcb38a48412f923cf60e59b6b2e0 f0db1b4cb89c4543aa892221c8094022 - - default default] Lock "316c1ab4-daa9-4a02-949f-84f24baeff9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.038 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating resource usage from migration eea929a9-fc96-477e-a79c-bda264da04db#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.059 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Migration eea929a9-fc96-477e-a79c-bda264da04db is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.059 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.059 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.102 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.113 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.134 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.134 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.395 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.746 182627 DEBUG nova.network.neutron [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.766 182627 DEBUG oslo_concurrency.lockutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.913 182627 DEBUG nova.virt.libvirt.driver [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.914 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Creating file /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/20e48bec480e46b9a0a8ed2558debac3.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 22 17:38:30 np0005592767 nova_compute[182623]: 2026-01-22 22:38:30.914 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/20e48bec480e46b9a0a8ed2558debac3.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.378 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/20e48bec480e46b9a0a8ed2558debac3.tmp" returned: 1 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.380 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/20e48bec480e46b9a0a8ed2558debac3.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.381 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Creating directory /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.381 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:31.484 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:38:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:31.486 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.486 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.615 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.620 182627 INFO nova.virt.libvirt.driver [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance already shutdown.#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.629 182627 INFO nova.virt.libvirt.driver [-] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Instance destroyed successfully.#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.631 182627 DEBUG nova.virt.libvirt.vif [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:36:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1990766256',display_name='tempest-ServerActionsTestOtherB-server-1990766256',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1990766256',id=119,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:37:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-rgsieo8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:38:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=17a24497-f021-486d-8f08-892c79ea1d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-212091580-network", "vif_mac": "fa:16:3e:50:26:52"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.632 182627 DEBUG nova.network.os_vif_util [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-212091580-network", "vif_mac": "fa:16:3e:50:26:52"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.634 182627 DEBUG nova.network.os_vif_util [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.634 182627 DEBUG os_vif [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.636 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.637 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap570f7f3c-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.639 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.644 182627 INFO os_vif [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f')#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.648 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.665 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.666 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.703 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.705 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.722 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.759 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.761 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Copying file /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk to 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:38:31 np0005592767 nova_compute[182623]: 2026-01-22 22:38:31.761 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.391 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "scp -r /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk" returned: 0 in 0.630s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.392 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Copying file /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.392 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk.config 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:38:32.488 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.708 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "scp -C -r /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk.config 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.config" returned: 0 in 0.315s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.709 182627 DEBUG nova.virt.libvirt.volume.remotefs [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Copying file /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.710 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk.info 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:38:32 np0005592767 nova_compute[182623]: 2026-01-22 22:38:32.987 182627 DEBUG oslo_concurrency.processutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "scp -C -r /var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31_resize/disk.info 192.168.122.101:/var/lib/nova/instances/17a24497-f021-486d-8f08-892c79ea1d31/disk.info" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:38:33 np0005592767 nova_compute[182623]: 2026-01-22 22:38:33.237 182627 DEBUG neutronclient.v2_0.client [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 22 17:38:33 np0005592767 nova_compute[182623]: 2026-01-22 22:38:33.367 182627 DEBUG oslo_concurrency.lockutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:33 np0005592767 nova_compute[182623]: 2026-01-22 22:38:33.367 182627 DEBUG oslo_concurrency.lockutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:33 np0005592767 nova_compute[182623]: 2026-01-22 22:38:33.368 182627 DEBUG oslo_concurrency.lockutils [None req-e42cf864-e0d4-4859-a533-71cbe8467932 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:35 np0005592767 podman[229211]: 2026-01-22 22:38:35.173672489 +0000 UTC m=+0.089109601 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:38:35 np0005592767 podman[229210]: 2026-01-22 22:38:35.181093844 +0000 UTC m=+0.095402375 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 22 17:38:35 np0005592767 nova_compute[182623]: 2026-01-22 22:38:35.630 182627 DEBUG nova.compute.manager [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Received event network-changed-570f7f3c-2f2a-4f46-87ea-784d78f09cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:38:35 np0005592767 nova_compute[182623]: 2026-01-22 22:38:35.630 182627 DEBUG nova.compute.manager [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Refreshing instance network info cache due to event network-changed-570f7f3c-2f2a-4f46-87ea-784d78f09cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:38:35 np0005592767 nova_compute[182623]: 2026-01-22 22:38:35.631 182627 DEBUG oslo_concurrency.lockutils [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:38:35 np0005592767 nova_compute[182623]: 2026-01-22 22:38:35.631 182627 DEBUG oslo_concurrency.lockutils [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:38:35 np0005592767 nova_compute[182623]: 2026-01-22 22:38:35.631 182627 DEBUG nova.network.neutron [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Refreshing network info cache for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:38:36 np0005592767 nova_compute[182623]: 2026-01-22 22:38:36.639 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:36 np0005592767 nova_compute[182623]: 2026-01-22 22:38:36.712 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:38 np0005592767 nova_compute[182623]: 2026-01-22 22:38:38.863 182627 DEBUG nova.network.neutron [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updated VIF entry in instance network info cache for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:38:38 np0005592767 nova_compute[182623]: 2026-01-22 22:38:38.864 182627 DEBUG nova.network.neutron [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:38:38 np0005592767 nova_compute[182623]: 2026-01-22 22:38:38.884 182627 DEBUG oslo_concurrency.lockutils [req-dd021e7b-ab8d-47b2-93be-627bbd68d05e req-79a48165-9553-45a7-882c-9ef946a624db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.101 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "17a24497-f021-486d-8f08-892c79ea1d31" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.102 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.103 182627 DEBUG nova.compute.manager [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Going to confirm migration 19 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.143 182627 DEBUG nova.objects.instance [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'info_cache' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.572 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.591 182627 DEBUG neutronclient.v2_0.client [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 570f7f3c-2f2a-4f46-87ea-784d78f09cc1 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.592 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.592 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:38:40 np0005592767 nova_compute[182623]: 2026-01-22 22:38:40.593 182627 DEBUG nova.network.neutron [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:38:41 np0005592767 nova_compute[182623]: 2026-01-22 22:38:41.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:41 np0005592767 nova_compute[182623]: 2026-01-22 22:38:41.714 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.275 182627 DEBUG nova.network.neutron [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Updating instance_info_cache with network_info: [{"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.297 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-17a24497-f021-486d-8f08-892c79ea1d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.298 182627 DEBUG nova.objects.instance [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'migration_context' on Instance uuid 17a24497-f021-486d-8f08-892c79ea1d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.331 182627 DEBUG nova.virt.libvirt.vif [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:36:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1990766256',display_name='tempest-ServerActionsTestOtherB-server-1990766256',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1990766256',id=119,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFkqkcnPVXoYCjIgI5bw5OA6/5nym1rzZZURq62sf76ZpC5y2dgqZ39wG5JFuphS0Mujaf51N2ioOXSv8BTIWm028Sgb05TqNV6DDbykFp1jT1uEcdV7QMSeYi3Dtxoog==',key_name='tempest-keypair-483211252',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:38:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-rgsieo8l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:38:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8b15fdf3e23640a2b9579790941bb346',uuid=17a24497-f021-486d-8f08-892c79ea1d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.332 182627 DEBUG nova.network.os_vif_util [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "address": "fa:16:3e:50:26:52", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap570f7f3c-2f", "ovs_interfaceid": "570f7f3c-2f2a-4f46-87ea-784d78f09cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.333 182627 DEBUG nova.network.os_vif_util [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.333 182627 DEBUG os_vif [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.335 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.335 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap570f7f3c-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.335 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.338 182627 INFO os_vif [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:52,bridge_name='br-int',has_traffic_filtering=True,id=570f7f3c-2f2a-4f46-87ea-784d78f09cc1,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap570f7f3c-2f')#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.338 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.338 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.468 182627 DEBUG nova.compute.provider_tree [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.487 182627 DEBUG nova.scheduler.client.report [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.535 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.535 182627 DEBUG nova.compute.manager [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: 17a24497-f021-486d-8f08-892c79ea1d31] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4805#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.666 182627 INFO nova.scheduler.client.report [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Deleted allocation for migration eea929a9-fc96-477e-a79c-bda264da04db#033[00m
Jan 22 17:38:42 np0005592767 nova_compute[182623]: 2026-01-22 22:38:42.761 182627 DEBUG oslo_concurrency.lockutils [None req-cdc4d2bf-74a5-4317-b242-a7e03515ddf6 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "17a24497-f021-486d-8f08-892c79ea1d31" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:38:43 np0005592767 podman[229253]: 2026-01-22 22:38:43.129901705 +0000 UTC m=+0.051031795 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:38:44 np0005592767 nova_compute[182623]: 2026-01-22 22:38:44.234 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121509.2328691, 316c1ab4-daa9-4a02-949f-84f24baeff9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:38:44 np0005592767 nova_compute[182623]: 2026-01-22 22:38:44.235 182627 INFO nova.compute.manager [-] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:38:44 np0005592767 nova_compute[182623]: 2026-01-22 22:38:44.262 182627 DEBUG nova.compute.manager [None req-7b1ae898-6889-4c37-aa34-2e96d0af016d - - - - - -] [instance: 316c1ab4-daa9-4a02-949f-84f24baeff9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:38:46 np0005592767 nova_compute[182623]: 2026-01-22 22:38:46.646 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:46 np0005592767 nova_compute[182623]: 2026-01-22 22:38:46.716 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:51 np0005592767 nova_compute[182623]: 2026-01-22 22:38:51.650 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:51 np0005592767 nova_compute[182623]: 2026-01-22 22:38:51.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:54 np0005592767 podman[229277]: 2026-01-22 22:38:54.144883906 +0000 UTC m=+0.054968844 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:38:56 np0005592767 nova_compute[182623]: 2026-01-22 22:38:56.653 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:38:56 np0005592767 nova_compute[182623]: 2026-01-22 22:38:56.720 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:00 np0005592767 podman[229297]: 2026-01-22 22:39:00.135072719 +0000 UTC m=+0.053489174 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 17:39:00 np0005592767 podman[229296]: 2026-01-22 22:39:00.183041668 +0000 UTC m=+0.105313270 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 22 17:39:01 np0005592767 nova_compute[182623]: 2026-01-22 22:39:01.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:01 np0005592767 nova_compute[182623]: 2026-01-22 22:39:01.754 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:06 np0005592767 podman[229343]: 2026-01-22 22:39:06.142618462 +0000 UTC m=+0.059596662 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:39:06 np0005592767 podman[229342]: 2026-01-22 22:39:06.143062925 +0000 UTC m=+0.055879580 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:39:06 np0005592767 nova_compute[182623]: 2026-01-22 22:39:06.659 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:06 np0005592767 nova_compute[182623]: 2026-01-22 22:39:06.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:11 np0005592767 nova_compute[182623]: 2026-01-22 22:39:11.662 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:11 np0005592767 nova_compute[182623]: 2026-01-22 22:39:11.806 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:12.111 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:12.112 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:12.112 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:14 np0005592767 podman[229385]: 2026-01-22 22:39:14.141244422 +0000 UTC m=+0.063930263 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:39:16 np0005592767 nova_compute[182623]: 2026-01-22 22:39:16.664 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:16 np0005592767 nova_compute[182623]: 2026-01-22 22:39:16.808 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:21 np0005592767 nova_compute[182623]: 2026-01-22 22:39:21.667 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:21 np0005592767 nova_compute[182623]: 2026-01-22 22:39:21.810 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:21 np0005592767 nova_compute[182623]: 2026-01-22 22:39:21.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:21 np0005592767 nova_compute[182623]: 2026-01-22 22:39:21.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:23 np0005592767 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 22 17:39:23 np0005592767 nova_compute[182623]: 2026-01-22 22:39:23.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:24 np0005592767 nova_compute[182623]: 2026-01-22 22:39:24.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:24 np0005592767 nova_compute[182623]: 2026-01-22 22:39:24.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:39:24 np0005592767 nova_compute[182623]: 2026-01-22 22:39:24.959 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:39:25 np0005592767 podman[229411]: 2026-01-22 22:39:25.151349566 +0000 UTC m=+0.063242053 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:39:25 np0005592767 nova_compute[182623]: 2026-01-22 22:39:25.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:26 np0005592767 nova_compute[182623]: 2026-01-22 22:39:26.669 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:26 np0005592767 nova_compute[182623]: 2026-01-22 22:39:26.813 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:26 np0005592767 nova_compute[182623]: 2026-01-22 22:39:26.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:26 np0005592767 nova_compute[182623]: 2026-01-22 22:39:26.998 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:26.999 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.000 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.000 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.279 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.281 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5708MB free_disk=73.18887710571289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.282 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.282 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.442 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.443 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.645 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.667 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.700 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.700 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:39:27 np0005592767 nova_compute[182623]: 2026-01-22 22:39:27.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:28 np0005592767 nova_compute[182623]: 2026-01-22 22:39:28.908 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:28 np0005592767 nova_compute[182623]: 2026-01-22 22:39:28.909 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:39:28 np0005592767 nova_compute[182623]: 2026-01-22 22:39:28.935 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:39:29 np0005592767 nova_compute[182623]: 2026-01-22 22:39:29.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:29 np0005592767 nova_compute[182623]: 2026-01-22 22:39:29.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:29 np0005592767 nova_compute[182623]: 2026-01-22 22:39:29.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:39:31 np0005592767 podman[229432]: 2026-01-22 22:39:31.194992301 +0000 UTC m=+0.099214541 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Jan 22 17:39:31 np0005592767 podman[229431]: 2026-01-22 22:39:31.259688964 +0000 UTC m=+0.164738437 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:39:31 np0005592767 nova_compute[182623]: 2026-01-22 22:39:31.674 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:31 np0005592767 nova_compute[182623]: 2026-01-22 22:39:31.815 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:31 np0005592767 nova_compute[182623]: 2026-01-22 22:39:31.918 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:32.255 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:39:32 np0005592767 nova_compute[182623]: 2026-01-22 22:39:32.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:32.257 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:39:34 np0005592767 nova_compute[182623]: 2026-01-22 22:39:34.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:36 np0005592767 nova_compute[182623]: 2026-01-22 22:39:36.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:36 np0005592767 nova_compute[182623]: 2026-01-22 22:39:36.817 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:37 np0005592767 podman[229477]: 2026-01-22 22:39:37.163317266 +0000 UTC m=+0.074147236 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:39:37 np0005592767 podman[229478]: 2026-01-22 22:39:37.182242261 +0000 UTC m=+0.086690814 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.655 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.655 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.678 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.886 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.887 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.899 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:39:40 np0005592767 nova_compute[182623]: 2026-01-22 22:39:40.899 182627 INFO nova.compute.claims [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.138 182627 DEBUG nova.compute.provider_tree [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.153 182627 DEBUG nova.scheduler.client.report [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.176 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.177 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.289 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.290 182627 DEBUG nova.network.neutron [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.348 182627 INFO nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.415 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.685 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.765 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.766 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.767 182627 INFO nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Creating image(s)#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.767 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "/var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.768 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.769 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "/var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.786 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.818 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.842 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.843 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.843 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.854 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.905 182627 DEBUG nova.policy [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b15fdf3e23640a2b9579790941bb346', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abdd987d004046138277253df8658aca', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.908 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.909 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.944 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.945 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:41 np0005592767 nova_compute[182623]: 2026-01-22 22:39:41.946 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.017 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.018 182627 DEBUG nova.virt.disk.api [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Checking if we can resize image /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.018 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.111 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.112 182627 DEBUG nova.virt.disk.api [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Cannot resize image /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.113 182627 DEBUG nova.objects.instance [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'migration_context' on Instance uuid d2f60ca9-6484-4117-9f84-43529005cdab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.149 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.150 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Ensure instance console log exists: /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.150 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.151 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:42 np0005592767 nova_compute[182623]: 2026-01-22 22:39:42.151 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:42.261 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:43 np0005592767 nova_compute[182623]: 2026-01-22 22:39:43.164 182627 DEBUG nova.network.neutron [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Successfully created port: 984273cd-97d3-49d1-a7f5-75e068bf8f8c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:39:45 np0005592767 podman[229532]: 2026-01-22 22:39:45.150351025 +0000 UTC m=+0.068470698 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.211 182627 DEBUG nova.network.neutron [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Successfully updated port: 984273cd-97d3-49d1-a7f5-75e068bf8f8c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.243 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.244 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.244 182627 DEBUG nova.network.neutron [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.399 182627 DEBUG nova.compute.manager [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received event network-changed-984273cd-97d3-49d1-a7f5-75e068bf8f8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.400 182627 DEBUG nova.compute.manager [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Refreshing instance network info cache due to event network-changed-984273cd-97d3-49d1-a7f5-75e068bf8f8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.401 182627 DEBUG oslo_concurrency.lockutils [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:39:45 np0005592767 nova_compute[182623]: 2026-01-22 22:39:45.643 182627 DEBUG nova.network.neutron [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:39:46 np0005592767 nova_compute[182623]: 2026-01-22 22:39:46.687 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:46 np0005592767 nova_compute[182623]: 2026-01-22 22:39:46.821 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:46 np0005592767 nova_compute[182623]: 2026-01-22 22:39:46.983 182627 DEBUG nova.network.neutron [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Updating instance_info_cache with network_info: [{"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.010 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.010 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance network_info: |[{"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.011 182627 DEBUG oslo_concurrency.lockutils [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.012 182627 DEBUG nova.network.neutron [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Refreshing network info cache for port 984273cd-97d3-49d1-a7f5-75e068bf8f8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.017 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Start _get_guest_xml network_info=[{"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.026 182627 WARNING nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.033 182627 DEBUG nova.virt.libvirt.host [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.036 182627 DEBUG nova.virt.libvirt.host [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.050 182627 DEBUG nova.virt.libvirt.host [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.051 182627 DEBUG nova.virt.libvirt.host [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.053 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.054 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.055 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.055 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.056 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.056 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.057 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.057 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.058 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.058 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.059 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.059 182627 DEBUG nova.virt.hardware [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.067 182627 DEBUG nova.virt.libvirt.vif [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-617516782',display_name='tempest-ServerActionsTestOtherB-server-617516782',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-617516782',id=131,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-qo04c450',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActions
TestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:41Z,user_data=None,user_id='8b15fdf3e23640a2b9579790941bb346',uuid=d2f60ca9-6484-4117-9f84-43529005cdab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.067 182627 DEBUG nova.network.os_vif_util [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.069 182627 DEBUG nova.network.os_vif_util [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.071 182627 DEBUG nova.objects.instance [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'pci_devices' on Instance uuid d2f60ca9-6484-4117-9f84-43529005cdab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.095 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <uuid>d2f60ca9-6484-4117-9f84-43529005cdab</uuid>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <name>instance-00000083</name>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerActionsTestOtherB-server-617516782</nova:name>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:39:47</nova:creationTime>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:user uuid="8b15fdf3e23640a2b9579790941bb346">tempest-ServerActionsTestOtherB-1598778832-project-member</nova:user>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:project uuid="abdd987d004046138277253df8658aca">tempest-ServerActionsTestOtherB-1598778832</nova:project>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        <nova:port uuid="984273cd-97d3-49d1-a7f5-75e068bf8f8c">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <entry name="serial">d2f60ca9-6484-4117-9f84-43529005cdab</entry>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <entry name="uuid">d2f60ca9-6484-4117-9f84-43529005cdab</entry>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.config"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:af:cc:63"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <target dev="tap984273cd-97"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/console.log" append="off"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:39:47 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:39:47 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:39:47 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:39:47 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.098 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Preparing to wait for external event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.099 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.099 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.100 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.101 182627 DEBUG nova.virt.libvirt.vif [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:39:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-617516782',display_name='tempest-ServerActionsTestOtherB-server-617516782',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-617516782',id=131,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-qo04c450',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:39:41Z,user_data=None,user_id='8b15fdf3e23640a2b9579790941bb346',uuid=d2f60ca9-6484-4117-9f84-43529005cdab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.102 182627 DEBUG nova.network.os_vif_util [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.103 182627 DEBUG nova.network.os_vif_util [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.104 182627 DEBUG os_vif [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.105 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.106 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.107 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.112 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap984273cd-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.113 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap984273cd-97, col_values=(('external_ids', {'iface-id': '984273cd-97d3-49d1-a7f5-75e068bf8f8c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:cc:63', 'vm-uuid': 'd2f60ca9-6484-4117-9f84-43529005cdab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:47 np0005592767 NetworkManager[54973]: <info>  [1769121587.1177] manager: (tap984273cd-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.128 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.130 182627 INFO os_vif [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97')#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.202 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.203 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.204 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] No VIF found with MAC fa:16:3e:af:cc:63, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:39:47 np0005592767 nova_compute[182623]: 2026-01-22 22:39:47.205 182627 INFO nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Using config drive#033[00m
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.362 182627 INFO nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Creating config drive at /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.config#033[00m
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.381 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_s6xc3k0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.522 182627 DEBUG oslo_concurrency.processutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_s6xc3k0" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:48 np0005592767 kernel: tap984273cd-97: entered promiscuous mode
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.6241] manager: (tap984273cd-97): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.667 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:48Z|00484|binding|INFO|Claiming lport 984273cd-97d3-49d1-a7f5-75e068bf8f8c for this chassis.
Jan 22 17:39:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:48Z|00485|binding|INFO|984273cd-97d3-49d1-a7f5-75e068bf8f8c: Claiming fa:16:3e:af:cc:63 10.100.0.13
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.679 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:48 np0005592767 systemd-machined[153912]: New machine qemu-64-instance-00000083.
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.6989] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.697 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.6999] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.714 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:cc:63 10.100.0.13'], port_security=['fa:16:3e:af:cc:63 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd2f60ca9-6484-4117-9f84-43529005cdab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2d993940-8666-43d7-8759-418fc1311e0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=984273cd-97d3-49d1-a7f5-75e068bf8f8c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.715 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 984273cd-97d3-49d1-a7f5-75e068bf8f8c in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 bound to our chassis#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.717 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 84d8b010-d968-4df4-bedf-0c350ae42113#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.733 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2e92b725-cb5f-4979-bb0c-a5dd1b392db1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.734 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap84d8b010-d1 in ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.736 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap84d8b010-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.736 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eda89645-4020-4099-b8aa-3d23ce15b0e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.737 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8179be-fc55-4fe4-ad1c-327f8f8dbf30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.761 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a030cd-86cc-48ad-b354-1e8bfe0577e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 systemd[1]: Started Virtual Machine qemu-64-instance-00000083.
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.795 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[99372758-0543-4ec6-aeff-7937b59130c8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 systemd-udevd[229580]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.8257] device (tap984273cd-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.8267] device (tap984273cd-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.871 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf1ce31-37f9-4c1f-9f24-fe6f4cce75ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 systemd-udevd[229584]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.8816] manager: (tap84d8b010-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.881 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.881 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2201b63e-924d-4674-8691-f542161319aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:48Z|00486|binding|INFO|Setting lport 984273cd-97d3-49d1-a7f5-75e068bf8f8c ovn-installed in OVS
Jan 22 17:39:48 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:48Z|00487|binding|INFO|Setting lport 984273cd-97d3-49d1-a7f5-75e068bf8f8c up in Southbound
Jan 22 17:39:48 np0005592767 nova_compute[182623]: 2026-01-22 22:39:48.925 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.940 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[31b0f2c9-cd5c-457c-af11-c5d7b16b48fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.943 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0b6ee9-11a9-41cb-80dc-07f44f20bf3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:48 np0005592767 NetworkManager[54973]: <info>  [1769121588.9739] device (tap84d8b010-d0): carrier: link connected
Jan 22 17:39:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:48.981 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[01cb6a78-a05a-4634-8d14-f9fe35f15c26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.002 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[efe6c69b-2d85-4546-b257-5e007335a77d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518555, 'reachable_time': 28198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229616, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.026 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[38618964-2db8-4a6f-b7a0-97b551120fa3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:3d39'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518555, 'tstamp': 518555}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229617, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.062 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3bd0ed-937e-4057-8604-e364402d7177]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap84d8b010-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:3d:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518555, 'reachable_time': 28198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229619, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.088 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121589.0869772, d2f60ca9-6484-4117-9f84-43529005cdab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.089 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] VM Started (Lifecycle Event)#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.117 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0175e208-272a-4735-bda5-82e226098fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.126 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.132 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121589.0905044, d2f60ca9-6484-4117-9f84-43529005cdab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.132 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.164 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.170 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.201 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.230 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f92fad48-5454-4c80-90ff-b0642bf7d180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.232 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.233 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.234 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84d8b010-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:49 np0005592767 NetworkManager[54973]: <info>  [1769121589.2385] manager: (tap84d8b010-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 22 17:39:49 np0005592767 kernel: tap84d8b010-d0: entered promiscuous mode
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.238 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.244 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap84d8b010-d0, col_values=(('external_ids', {'iface-id': '8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:49Z|00488|binding|INFO|Releasing lport 8ac0fd58-0c46-43d2-8dae-bbc51d1be8f8 from this chassis (sb_readonly=0)
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.247 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.248 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.249 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.251 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef4f254-8f3b-4394-b0d6-b76b1efb1b76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.252 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/84d8b010-d968-4df4-bedf-0c350ae42113.pid.haproxy
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 84d8b010-d968-4df4-bedf-0c350ae42113
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:39:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:49.254 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'env', 'PROCESS_TAG=haproxy-84d8b010-d968-4df4-bedf-0c350ae42113', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/84d8b010-d968-4df4-bedf-0c350ae42113.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.273 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.460 182627 DEBUG nova.compute.manager [req-0bb4e62c-ea72-4aed-a5a3-33d33dfa6e8b req-99e4053a-9c72-48fb-acd6-f42289b356d6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.460 182627 DEBUG oslo_concurrency.lockutils [req-0bb4e62c-ea72-4aed-a5a3-33d33dfa6e8b req-99e4053a-9c72-48fb-acd6-f42289b356d6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.461 182627 DEBUG oslo_concurrency.lockutils [req-0bb4e62c-ea72-4aed-a5a3-33d33dfa6e8b req-99e4053a-9c72-48fb-acd6-f42289b356d6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.461 182627 DEBUG oslo_concurrency.lockutils [req-0bb4e62c-ea72-4aed-a5a3-33d33dfa6e8b req-99e4053a-9c72-48fb-acd6-f42289b356d6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.461 182627 DEBUG nova.compute.manager [req-0bb4e62c-ea72-4aed-a5a3-33d33dfa6e8b req-99e4053a-9c72-48fb-acd6-f42289b356d6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Processing event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.462 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.471 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.472 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121589.4713008, d2f60ca9-6484-4117-9f84-43529005cdab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.473 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.477 182627 INFO nova.virt.libvirt.driver [-] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance spawned successfully.#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.478 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.500 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.510 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.515 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.516 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.517 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.517 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.518 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.519 182627 DEBUG nova.virt.libvirt.driver [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.578 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.645 182627 INFO nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Took 7.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.646 182627 DEBUG nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:39:49 np0005592767 podman[229650]: 2026-01-22 22:39:49.687765587 +0000 UTC m=+0.060117938 container create 73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.708 182627 DEBUG nova.network.neutron [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Updated VIF entry in instance network info cache for port 984273cd-97d3-49d1-a7f5-75e068bf8f8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.708 182627 DEBUG nova.network.neutron [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Updating instance_info_cache with network_info: [{"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:39:49 np0005592767 systemd[1]: Started libpod-conmon-73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918.scope.
Jan 22 17:39:49 np0005592767 podman[229650]: 2026-01-22 22:39:49.657790516 +0000 UTC m=+0.030142897 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.760 182627 DEBUG oslo_concurrency.lockutils [req-18297fc2-ee02-4a1e-bec8-23720e194683 req-fdfe2456-49bf-4e6b-9f7b-e5c77af9d9f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:39:49 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:39:49 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6537f4607e3ee7bc78d13dce06f349fe6593adbc74f6e997cb857549440a2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.774 182627 INFO nova.compute.manager [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Took 8.97 seconds to build instance.#033[00m
Jan 22 17:39:49 np0005592767 podman[229650]: 2026-01-22 22:39:49.783584992 +0000 UTC m=+0.155937363 container init 73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:39:49 np0005592767 nova_compute[182623]: 2026-01-22 22:39:49.792 182627 DEBUG oslo_concurrency.lockutils [None req-b779bde1-886e-4e9f-a9c5-2183319bcd5f 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:49 np0005592767 podman[229650]: 2026-01-22 22:39:49.793736913 +0000 UTC m=+0.166089264 container start 73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 17:39:49 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [NOTICE]   (229669) : New worker (229671) forked
Jan 22 17:39:49 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [NOTICE]   (229669) : Loading success.
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.565 182627 DEBUG nova.compute.manager [req-d5b0a0a7-3a17-46b2-bf23-f03b5185fa09 req-dc2faaf1-a64a-4779-9852-8d0d1beaf06e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.565 182627 DEBUG oslo_concurrency.lockutils [req-d5b0a0a7-3a17-46b2-bf23-f03b5185fa09 req-dc2faaf1-a64a-4779-9852-8d0d1beaf06e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.566 182627 DEBUG oslo_concurrency.lockutils [req-d5b0a0a7-3a17-46b2-bf23-f03b5185fa09 req-dc2faaf1-a64a-4779-9852-8d0d1beaf06e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.566 182627 DEBUG oslo_concurrency.lockutils [req-d5b0a0a7-3a17-46b2-bf23-f03b5185fa09 req-dc2faaf1-a64a-4779-9852-8d0d1beaf06e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.566 182627 DEBUG nova.compute.manager [req-d5b0a0a7-3a17-46b2-bf23-f03b5185fa09 req-dc2faaf1-a64a-4779-9852-8d0d1beaf06e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] No waiting events found dispatching network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.567 182627 WARNING nova.compute.manager [req-d5b0a0a7-3a17-46b2-bf23-f03b5185fa09 req-dc2faaf1-a64a-4779-9852-8d0d1beaf06e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received unexpected event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c for instance with vm_state active and task_state None.#033[00m
Jan 22 17:39:51 np0005592767 nova_compute[182623]: 2026-01-22 22:39:51.824 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.181 182627 INFO nova.compute.manager [None req-44163279-7874-43c7-93f3-d8e538be0e8c 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Pausing#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.182 182627 DEBUG nova.objects.instance [None req-44163279-7874-43c7-93f3-d8e538be0e8c 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'flavor' on Instance uuid d2f60ca9-6484-4117-9f84-43529005cdab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.227 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121592.2275229, d2f60ca9-6484-4117-9f84-43529005cdab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.228 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.230 182627 DEBUG nova.compute.manager [None req-44163279-7874-43c7-93f3-d8e538be0e8c 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.255 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.258 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:39:52 np0005592767 nova_compute[182623]: 2026-01-22 22:39:52.285 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 22 17:39:53 np0005592767 nova_compute[182623]: 2026-01-22 22:39:53.673 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:39:56 np0005592767 podman[229680]: 2026-01-22 22:39:56.194981068 +0000 UTC m=+0.111959123 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:39:56 np0005592767 nova_compute[182623]: 2026-01-22 22:39:56.826 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:56.999 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.000 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.001 182627 INFO nova.compute.manager [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Shelving#033[00m
Jan 22 17:39:57 np0005592767 kernel: tap984273cd-97 (unregistering): left promiscuous mode
Jan 22 17:39:57 np0005592767 NetworkManager[54973]: <info>  [1769121597.0666] device (tap984273cd-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:39:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:57Z|00489|binding|INFO|Releasing lport 984273cd-97d3-49d1-a7f5-75e068bf8f8c from this chassis (sb_readonly=0)
Jan 22 17:39:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:57Z|00490|binding|INFO|Setting lport 984273cd-97d3-49d1-a7f5-75e068bf8f8c down in Southbound
Jan 22 17:39:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:39:57Z|00491|binding|INFO|Removing iface tap984273cd-97 ovn-installed in OVS
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.074 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.077 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.088 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:cc:63 10.100.0.13'], port_security=['fa:16:3e:af:cc:63 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd2f60ca9-6484-4117-9f84-43529005cdab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-84d8b010-d968-4df4-bedf-0c350ae42113', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'abdd987d004046138277253df8658aca', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2d993940-8666-43d7-8759-418fc1311e0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2acacb93-e9c9-470a-a730-8ade0736629d, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=984273cd-97d3-49d1-a7f5-75e068bf8f8c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.090 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 984273cd-97d3-49d1-a7f5-75e068bf8f8c in datapath 84d8b010-d968-4df4-bedf-0c350ae42113 unbound from our chassis#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.094 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 84d8b010-d968-4df4-bedf-0c350ae42113, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.096 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[634ea500-b260-40f2-952a-012a42637e57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.097 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 namespace which is not needed anymore#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.096 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 22 17:39:57 np0005592767 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000083.scope: Consumed 3.063s CPU time.
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 systemd-machined[153912]: Machine qemu-64-instance-00000083 terminated.
Jan 22 17:39:57 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [NOTICE]   (229669) : haproxy version is 2.8.14-c23fe91
Jan 22 17:39:57 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [NOTICE]   (229669) : path to executable is /usr/sbin/haproxy
Jan 22 17:39:57 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [WARNING]  (229669) : Exiting Master process...
Jan 22 17:39:57 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [WARNING]  (229669) : Exiting Master process...
Jan 22 17:39:57 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [ALERT]    (229669) : Current worker (229671) exited with code 143 (Terminated)
Jan 22 17:39:57 np0005592767 neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113[229665]: [WARNING]  (229669) : All workers exited. Exiting... (0)
Jan 22 17:39:57 np0005592767 systemd[1]: libpod-73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918.scope: Deactivated successfully.
Jan 22 17:39:57 np0005592767 conmon[229665]: conmon 73db7fbde8ba0d3d07d0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918.scope/container/memory.events
Jan 22 17:39:57 np0005592767 podman[229726]: 2026-01-22 22:39:57.299066534 +0000 UTC m=+0.065601909 container died 73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:39:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918-userdata-shm.mount: Deactivated successfully.
Jan 22 17:39:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay-1a6537f4607e3ee7bc78d13dce06f349fe6593adbc74f6e997cb857549440a2f-merged.mount: Deactivated successfully.
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.343 182627 INFO nova.virt.libvirt.driver [-] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance destroyed successfully.#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.343 182627 DEBUG nova.objects.instance [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'numa_topology' on Instance uuid d2f60ca9-6484-4117-9f84-43529005cdab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:39:57 np0005592767 podman[229726]: 2026-01-22 22:39:57.35197676 +0000 UTC m=+0.118512115 container cleanup 73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:39:57 np0005592767 systemd[1]: libpod-conmon-73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918.scope: Deactivated successfully.
Jan 22 17:39:57 np0005592767 podman[229771]: 2026-01-22 22:39:57.438169529 +0000 UTC m=+0.059232892 container remove 73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.448 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[54a93266-c88e-4e79-b798-1ab6ebaee957]: (4, ('Thu Jan 22 10:39:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 (73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918)\n73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918\nThu Jan 22 10:39:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 (73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918)\n73db7fbde8ba0d3d07d0d07b804dffa5fc67461aca5d1f7ee4333c3bb404c918\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.451 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75056637-0fc5-4e86-9368-282948594fc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.453 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84d8b010-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.495 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 kernel: tap84d8b010-d0: left promiscuous mode
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.516 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.521 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[596e3d41-4a27-4a8a-90db-0582b7e05998]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.543 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7e539c66-2c2b-4e89-b8e8-e7044fbf5251]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.545 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0ded25-b223-4ee6-8f15-6fb04b8836cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.575 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[93087ba3-e979-4eb0-8611-8caa903f4ada]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518543, 'reachable_time': 22667, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229791, 'error': None, 'target': 'ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.578 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-84d8b010-d968-4df4-bedf-0c350ae42113 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:39:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:39:57.579 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f3615a69-0e8a-44c0-8dbf-044461780cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:39:57 np0005592767 systemd[1]: run-netns-ovnmeta\x2d84d8b010\x2dd968\x2d4df4\x2dbedf\x2d0c350ae42113.mount: Deactivated successfully.
Jan 22 17:39:57 np0005592767 nova_compute[182623]: 2026-01-22 22:39:57.944 182627 INFO nova.virt.libvirt.driver [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Beginning cold snapshot process#033[00m
Jan 22 17:39:58 np0005592767 nova_compute[182623]: 2026-01-22 22:39:58.263 182627 DEBUG nova.privsep.utils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:39:58 np0005592767 nova_compute[182623]: 2026-01-22 22:39:58.264 182627 DEBUG oslo_concurrency.processutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk /var/lib/nova/instances/snapshots/tmpiqffpgom/26208d1e90b948288c58d0ea3914e1e1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:39:58 np0005592767 nova_compute[182623]: 2026-01-22 22:39:58.455 182627 DEBUG oslo_concurrency.processutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab/disk /var/lib/nova/instances/snapshots/tmpiqffpgom/26208d1e90b948288c58d0ea3914e1e1" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:39:58 np0005592767 nova_compute[182623]: 2026-01-22 22:39:58.457 182627 INFO nova.virt.libvirt.driver [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.371 182627 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received event network-vif-unplugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.372 182627 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.372 182627 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.373 182627 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.373 182627 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] No waiting events found dispatching network-vif-unplugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.373 182627 WARNING nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received unexpected event network-vif-unplugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c for instance with vm_state paused and task_state shelving_image_uploading.#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.373 182627 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.373 182627 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.374 182627 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.374 182627 DEBUG oslo_concurrency.lockutils [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.374 182627 DEBUG nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] No waiting events found dispatching network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:39:59 np0005592767 nova_compute[182623]: 2026-01-22 22:39:59.374 182627 WARNING nova.compute.manager [req-0f7c711f-bb15-4f13-8906-73e5d31086dd req-000cc48f-08c4-49b7-ad4b-01e4014224bf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Received unexpected event network-vif-plugged-984273cd-97d3-49d1-a7f5-75e068bf8f8c for instance with vm_state paused and task_state shelving_image_uploading.#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.815 182627 INFO nova.virt.libvirt.driver [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Snapshot image upload complete#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.816 182627 DEBUG nova.compute.manager [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.829 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.917 182627 INFO nova.compute.manager [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Shelve offloading#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.935 182627 INFO nova.virt.libvirt.driver [-] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance destroyed successfully.#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.935 182627 DEBUG nova.compute.manager [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.938 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.939 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquired lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:40:01 np0005592767 nova_compute[182623]: 2026-01-22 22:40:01.939 182627 DEBUG nova.network.neutron [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:40:02 np0005592767 nova_compute[182623]: 2026-01-22 22:40:02.122 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:02 np0005592767 podman[229801]: 2026-01-22 22:40:02.161875744 +0000 UTC m=+0.072548501 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public)
Jan 22 17:40:02 np0005592767 podman[229800]: 2026-01-22 22:40:02.200176766 +0000 UTC m=+0.106535544 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 22 17:40:03 np0005592767 nova_compute[182623]: 2026-01-22 22:40:03.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:03 np0005592767 nova_compute[182623]: 2026-01-22 22:40:03.243 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:04 np0005592767 nova_compute[182623]: 2026-01-22 22:40:04.349 182627 DEBUG nova.network.neutron [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Updating instance_info_cache with network_info: [{"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:40:04 np0005592767 nova_compute[182623]: 2026-01-22 22:40:04.386 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Releasing lock "refresh_cache-d2f60ca9-6484-4117-9f84-43529005cdab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.491 182627 INFO nova.virt.libvirt.driver [-] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Instance destroyed successfully.#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.492 182627 DEBUG nova.objects.instance [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lazy-loading 'resources' on Instance uuid d2f60ca9-6484-4117-9f84-43529005cdab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.525 182627 DEBUG nova.virt.libvirt.vif [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:39:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-617516782',display_name='tempest-ServerActionsTestOtherB-server-617516782',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-617516782',id=131,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='abdd987d004046138277253df8658aca',ramdisk_id='',reservation_id='r-qo04c450',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1598778832',owner_user_name='tempest-ServerActionsTestOtherB-1598778832-project-member',shelved_at='2026-01-22T22:40:01.816612',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='eccfdd59-6583-4c4b-826c-b82daef59ed5'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:39:58Z,user_data=None,user_id='8b15fdf3e23640a2b9579790941bb346',uuid=d2f60ca9-6484-4117-9f84-43529005cdab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.526 182627 DEBUG nova.network.os_vif_util [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converting VIF {"id": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "address": "fa:16:3e:af:cc:63", "network": {"id": "84d8b010-d968-4df4-bedf-0c350ae42113", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-212091580-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abdd987d004046138277253df8658aca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap984273cd-97", "ovs_interfaceid": "984273cd-97d3-49d1-a7f5-75e068bf8f8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.528 182627 DEBUG nova.network.os_vif_util [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.529 182627 DEBUG os_vif [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.532 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.533 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap984273cd-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.535 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.538 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.541 182627 INFO os_vif [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:cc:63,bridge_name='br-int',has_traffic_filtering=True,id=984273cd-97d3-49d1-a7f5-75e068bf8f8c,network=Network(84d8b010-d968-4df4-bedf-0c350ae42113),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap984273cd-97')#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.542 182627 INFO nova.virt.libvirt.driver [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Deleting instance files /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab_del#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.543 182627 INFO nova.virt.libvirt.driver [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Deletion of /var/lib/nova/instances/d2f60ca9-6484-4117-9f84-43529005cdab_del complete#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.671 182627 INFO nova.scheduler.client.report [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Deleted allocations for instance d2f60ca9-6484-4117-9f84-43529005cdab#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.802 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.802 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.849 182627 DEBUG nova.compute.provider_tree [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.864 182627 DEBUG nova.scheduler.client.report [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.891 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:05 np0005592767 nova_compute[182623]: 2026-01-22 22:40:05.977 182627 DEBUG oslo_concurrency.lockutils [None req-56c4e519-896a-4ede-b5a3-1e01a75be0fd 8b15fdf3e23640a2b9579790941bb346 abdd987d004046138277253df8658aca - - default default] Lock "d2f60ca9-6484-4117-9f84-43529005cdab" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 8.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:06 np0005592767 nova_compute[182623]: 2026-01-22 22:40:06.830 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:40:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:40:08 np0005592767 podman[229852]: 2026-01-22 22:40:08.141598886 +0000 UTC m=+0.055558551 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:40:08 np0005592767 podman[229851]: 2026-01-22 22:40:08.152052415 +0000 UTC m=+0.065144006 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:40:10 np0005592767 nova_compute[182623]: 2026-01-22 22:40:10.537 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:12.112 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:12.113 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:12.113 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:12 np0005592767 nova_compute[182623]: 2026-01-22 22:40:12.193 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:12 np0005592767 nova_compute[182623]: 2026-01-22 22:40:12.336 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121597.33466, d2f60ca9-6484-4117-9f84-43529005cdab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:40:12 np0005592767 nova_compute[182623]: 2026-01-22 22:40:12.336 182627 INFO nova.compute.manager [-] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:40:12 np0005592767 nova_compute[182623]: 2026-01-22 22:40:12.365 182627 DEBUG nova.compute.manager [None req-fb6324b3-b132-4590-9aba-d0d295c2b949 - - - - - -] [instance: d2f60ca9-6484-4117-9f84-43529005cdab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:15 np0005592767 nova_compute[182623]: 2026-01-22 22:40:15.539 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:16 np0005592767 podman[229894]: 2026-01-22 22:40:16.151805578 +0000 UTC m=+0.057317859 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:40:16 np0005592767 nova_compute[182623]: 2026-01-22 22:40:16.834 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:20 np0005592767 nova_compute[182623]: 2026-01-22 22:40:20.544 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:20 np0005592767 nova_compute[182623]: 2026-01-22 22:40:20.669 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:21 np0005592767 nova_compute[182623]: 2026-01-22 22:40:21.836 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:23 np0005592767 nova_compute[182623]: 2026-01-22 22:40:23.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:23 np0005592767 nova_compute[182623]: 2026-01-22 22:40:23.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.467 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.468 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.493 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.621 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.622 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.633 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.633 182627 INFO nova.compute.claims [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.809 182627 DEBUG nova.compute.provider_tree [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.829 182627 DEBUG nova.scheduler.client.report [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.874 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.875 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.929 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.930 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.949 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.950 182627 DEBUG nova.network.neutron [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:40:24 np0005592767 nova_compute[182623]: 2026-01-22 22:40:24.988 182627 INFO nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.018 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.139 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.142 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.143 182627 INFO nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Creating image(s)#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.144 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.144 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.146 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.177 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.231 182627 DEBUG nova.policy [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80fc173d19874dafa5e0cbd18c7ccf24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839eb51e89b14157b8da40ae1b480ef3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.277 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.279 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.280 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.307 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.407 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.409 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.458 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.459 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.460 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.530 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.531 182627 DEBUG nova.virt.disk.api [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.531 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.551 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.591 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.592 182627 DEBUG nova.virt.disk.api [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.593 182627 DEBUG nova.objects.instance [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.612 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.613 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Ensure instance console log exists: /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.613 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.614 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.614 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:25 np0005592767 nova_compute[182623]: 2026-01-22 22:40:25.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.757 182627 DEBUG nova.network.neutron [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Successfully created port: 4c730410-eed9-46ca-b9e0-5ba95117cece _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.838 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:26 np0005592767 nova_compute[182623]: 2026-01-22 22:40:26.927 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:40:27 np0005592767 podman[229934]: 2026-01-22 22:40:27.078134227 +0000 UTC m=+0.099658493 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.170 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.172 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5701MB free_disk=73.1885986328125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.172 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.173 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.252 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 3bede382-d3d0-4053-a98e-0add602d4f2f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.253 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.253 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.317 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.338 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.375 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:40:27 np0005592767 nova_compute[182623]: 2026-01-22 22:40:27.376 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:28 np0005592767 nova_compute[182623]: 2026-01-22 22:40:28.376 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:28 np0005592767 nova_compute[182623]: 2026-01-22 22:40:28.376 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:40:28 np0005592767 nova_compute[182623]: 2026-01-22 22:40:28.686 182627 DEBUG nova.network.neutron [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Successfully updated port: 4c730410-eed9-46ca-b9e0-5ba95117cece _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:40:28 np0005592767 nova_compute[182623]: 2026-01-22 22:40:28.704 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:40:28 np0005592767 nova_compute[182623]: 2026-01-22 22:40:28.705 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:40:28 np0005592767 nova_compute[182623]: 2026-01-22 22:40:28.705 182627 DEBUG nova.network.neutron [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:40:29 np0005592767 nova_compute[182623]: 2026-01-22 22:40:29.340 182627 DEBUG nova.network.neutron [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:40:29 np0005592767 nova_compute[182623]: 2026-01-22 22:40:29.753 182627 DEBUG nova.compute.manager [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-changed-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:40:29 np0005592767 nova_compute[182623]: 2026-01-22 22:40:29.754 182627 DEBUG nova.compute.manager [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Refreshing instance network info cache due to event network-changed-4c730410-eed9-46ca-b9e0-5ba95117cece. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:40:29 np0005592767 nova_compute[182623]: 2026-01-22 22:40:29.754 182627 DEBUG oslo_concurrency.lockutils [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:40:29 np0005592767 nova_compute[182623]: 2026-01-22 22:40:29.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:40:30 np0005592767 nova_compute[182623]: 2026-01-22 22:40:30.555 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.376 182627 DEBUG nova.network.neutron [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updating instance_info_cache with network_info: [{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.420 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.420 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance network_info: |[{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.421 182627 DEBUG oslo_concurrency.lockutils [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.421 182627 DEBUG nova.network.neutron [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Refreshing network info cache for port 4c730410-eed9-46ca-b9e0-5ba95117cece _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.424 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Start _get_guest_xml network_info=[{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.429 182627 WARNING nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.434 182627 DEBUG nova.virt.libvirt.host [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.435 182627 DEBUG nova.virt.libvirt.host [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.442 182627 DEBUG nova.virt.libvirt.host [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.443 182627 DEBUG nova.virt.libvirt.host [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.444 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.444 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.445 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.445 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.445 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.445 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.446 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.446 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.447 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.447 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.447 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.447 182627 DEBUG nova.virt.hardware [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.452 182627 DEBUG nova.virt.libvirt.vif [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:40:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-598973057',display_name='tempest-TestNetworkAdvancedServerOps-server-598973057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-598973057',id=134,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODE6+SJJ8btY9W/mynN3qwbr5EVg7IaT7YrLyKKAYTxeOYe9MstU5bquIsyG2Q89VVqa7qu3wgDRxKifX86BN44B+A089z3VEkmm7pSVaOJes6RiePr56lhtjWcxilonw==',key_name='tempest-TestNetworkAdvancedServerOps-1883230350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-cvjrm05k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:40:25Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=3bede382-d3d0-4053-a98e-0add602d4f2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.452 182627 DEBUG nova.network.os_vif_util [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.453 182627 DEBUG nova.network.os_vif_util [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.454 182627 DEBUG nova.objects.instance [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.471 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <uuid>3bede382-d3d0-4053-a98e-0add602d4f2f</uuid>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <name>instance-00000086</name>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-598973057</nova:name>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:40:31</nova:creationTime>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        <nova:port uuid="4c730410-eed9-46ca-b9e0-5ba95117cece">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <entry name="serial">3bede382-d3d0-4053-a98e-0add602d4f2f</entry>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <entry name="uuid">3bede382-d3d0-4053-a98e-0add602d4f2f</entry>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:1e:ce:2c"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <target dev="tap4c730410-ee"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/console.log" append="off"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:40:31 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:40:31 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:40:31 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:40:31 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.472 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Preparing to wait for external event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.473 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.473 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.473 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.474 182627 DEBUG nova.virt.libvirt.vif [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:40:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-598973057',display_name='tempest-TestNetworkAdvancedServerOps-server-598973057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-598973057',id=134,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODE6+SJJ8btY9W/mynN3qwbr5EVg7IaT7YrLyKKAYTxeOYe9MstU5bquIsyG2Q89VVqa7qu3wgDRxKifX86BN44B+A089z3VEkmm7pSVaOJes6RiePr56lhtjWcxilonw==',key_name='tempest-TestNetworkAdvancedServerOps-1883230350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-cvjrm05k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:40:25Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=3bede382-d3d0-4053-a98e-0add602d4f2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.474 182627 DEBUG nova.network.os_vif_util [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.475 182627 DEBUG nova.network.os_vif_util [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.475 182627 DEBUG os_vif [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.476 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.476 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.476 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.481 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.481 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c730410-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.482 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c730410-ee, col_values=(('external_ids', {'iface-id': '4c730410-eed9-46ca-b9e0-5ba95117cece', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:ce:2c', 'vm-uuid': '3bede382-d3d0-4053-a98e-0add602d4f2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.484 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:31 np0005592767 NetworkManager[54973]: <info>  [1769121631.4858] manager: (tap4c730410-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.486 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.494 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.495 182627 INFO os_vif [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee')#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.547 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.548 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.548 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:1e:ce:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.549 182627 INFO nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Using config drive#033[00m
Jan 22 17:40:31 np0005592767 nova_compute[182623]: 2026-01-22 22:40:31.842 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.153 182627 INFO nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Creating config drive at /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.157 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjcqef04y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.295 182627 DEBUG oslo_concurrency.processutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjcqef04y" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:32 np0005592767 kernel: tap4c730410-ee: entered promiscuous mode
Jan 22 17:40:32 np0005592767 NetworkManager[54973]: <info>  [1769121632.3747] manager: (tap4c730410-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Jan 22 17:40:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:32Z|00492|binding|INFO|Claiming lport 4c730410-eed9-46ca-b9e0-5ba95117cece for this chassis.
Jan 22 17:40:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:32Z|00493|binding|INFO|4c730410-eed9-46ca-b9e0-5ba95117cece: Claiming fa:16:3e:1e:ce:2c 10.100.0.10
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.413 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.417 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.430 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ce:2c 10.100.0.10'], port_security=['fa:16:3e:1e:ce:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3bede382-d3d0-4053-a98e-0add602d4f2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f0fd7437-d7e3-47f0-89a5-286033fb63ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f907ca0-d52f-4aef-b032-e68256e0ba9f, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4c730410-eed9-46ca-b9e0-5ba95117cece) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.431 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4c730410-eed9-46ca-b9e0-5ba95117cece in datapath 09ec689a-3640-4bd8-88d1-a8c54c02874e bound to our chassis#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.433 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09ec689a-3640-4bd8-88d1-a8c54c02874e#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.444 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7dc08e-f898-4847-91b6-c44ae5237400]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.445 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09ec689a-31 in ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.448 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09ec689a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.448 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5aeb84e6-c323-490e-aef1-70ccf156c043]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.448 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2e68d9-ff5a-4b46-b336-e4ed4c7c984b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 systemd-udevd[230013]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.460 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[58b0bde4-696e-4d4a-ab0d-592165875d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 podman[229965]: 2026-01-22 22:40:32.46426512 +0000 UTC m=+0.124660068 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=)
Jan 22 17:40:32 np0005592767 NetworkManager[54973]: <info>  [1769121632.4758] device (tap4c730410-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:40:32 np0005592767 NetworkManager[54973]: <info>  [1769121632.4767] device (tap4c730410-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:40:32 np0005592767 systemd-machined[153912]: New machine qemu-65-instance-00000086.
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.486 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e167374b-5b2d-4bf8-ac16-e2de1e21c77f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:32Z|00494|binding|INFO|Setting lport 4c730410-eed9-46ca-b9e0-5ba95117cece ovn-installed in OVS
Jan 22 17:40:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:32Z|00495|binding|INFO|Setting lport 4c730410-eed9-46ca-b9e0-5ba95117cece up in Southbound
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.489 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 systemd[1]: Started Virtual Machine qemu-65-instance-00000086.
Jan 22 17:40:32 np0005592767 podman[229961]: 2026-01-22 22:40:32.503750584 +0000 UTC m=+0.161758485 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.515 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[92018f2c-0396-49a4-931a-7c80dcbc6231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.520 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[35b006fc-8faa-4481-b36f-a9cad0814e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 NetworkManager[54973]: <info>  [1769121632.5213] manager: (tap09ec689a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.547 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd18ccc-b81c-49c1-8a89-f88b353ad939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.550 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ab685b19-d284-4783-b3a3-98dcb92bab09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 NetworkManager[54973]: <info>  [1769121632.5701] device (tap09ec689a-30): carrier: link connected
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.574 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8bfff909-7fa5-422e-bcbe-4d7f96bcc619]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.590 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[762873ab-6c7e-486e-a032-524bbfa424f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09ec689a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:4b:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522914, 'reachable_time': 15812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230054, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.603 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae710bd-9d8d-4795-a76a-eed2836d4406]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:4bf7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522914, 'tstamp': 522914}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230055, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.616 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d40d64-ba0e-4d6f-b3dd-49bc278be648]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09ec689a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:4b:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522914, 'reachable_time': 15812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230056, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.646 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[051c2fae-064d-4d8d-941d-c94dae70d7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.707 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[324b0346-ec16-49af-8363-d34308b62632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.708 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09ec689a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.709 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.709 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09ec689a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:32 np0005592767 NetworkManager[54973]: <info>  [1769121632.7118] manager: (tap09ec689a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.712 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 kernel: tap09ec689a-30: entered promiscuous mode
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.715 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09ec689a-30, col_values=(('external_ids', {'iface-id': '3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:32Z|00496|binding|INFO|Releasing lport 3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61 from this chassis (sb_readonly=0)
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.718 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09ec689a-3640-4bd8-88d1-a8c54c02874e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09ec689a-3640-4bd8-88d1-a8c54c02874e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.719 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa0c431-1cb3-42f7-b7a5-0fd6a5e5ed6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.719 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-09ec689a-3640-4bd8-88d1-a8c54c02874e
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/09ec689a-3640-4bd8-88d1-a8c54c02874e.pid.haproxy
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 09ec689a-3640-4bd8-88d1-a8c54c02874e
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:40:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:32.720 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'env', 'PROCESS_TAG=haproxy-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09ec689a-3640-4bd8-88d1-a8c54c02874e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.735 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.811 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121632.8109198, 3bede382-d3d0-4053-a98e-0add602d4f2f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.812 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] VM Started (Lifecycle Event)#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.853 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.858 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121632.811286, 3bede382-d3d0-4053-a98e-0add602d4f2f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.858 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.882 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.890 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.911 182627 DEBUG nova.compute.manager [req-ed03618c-dfd3-41f2-810f-3f13728347d1 req-c3b506e3-5a39-4cdf-81f8-ec76434e3972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.912 182627 DEBUG oslo_concurrency.lockutils [req-ed03618c-dfd3-41f2-810f-3f13728347d1 req-c3b506e3-5a39-4cdf-81f8-ec76434e3972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.912 182627 DEBUG oslo_concurrency.lockutils [req-ed03618c-dfd3-41f2-810f-3f13728347d1 req-c3b506e3-5a39-4cdf-81f8-ec76434e3972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.913 182627 DEBUG oslo_concurrency.lockutils [req-ed03618c-dfd3-41f2-810f-3f13728347d1 req-c3b506e3-5a39-4cdf-81f8-ec76434e3972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.913 182627 DEBUG nova.compute.manager [req-ed03618c-dfd3-41f2-810f-3f13728347d1 req-c3b506e3-5a39-4cdf-81f8-ec76434e3972 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Processing event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.914 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.916 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.925 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.926 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121632.9259293, 3bede382-d3d0-4053-a98e-0add602d4f2f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.926 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] VM Resumed (Lifecycle Event)
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.933 182627 INFO nova.virt.libvirt.driver [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance spawned successfully.
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.934 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.945 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.951 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.954 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.955 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.955 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.956 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.957 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.957 182627 DEBUG nova.virt.libvirt.driver [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:40:32 np0005592767 nova_compute[182623]: 2026-01-22 22:40:32.980 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.087 182627 INFO nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Took 7.95 seconds to spawn the instance on the hypervisor.
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.088 182627 DEBUG nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:40:33 np0005592767 podman[230095]: 2026-01-22 22:40:33.125338163 +0000 UTC m=+0.058734628 container create fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:40:33 np0005592767 systemd[1]: Started libpod-conmon-fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42.scope.
Jan 22 17:40:33 np0005592767 podman[230095]: 2026-01-22 22:40:33.093102894 +0000 UTC m=+0.026499329 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.202 182627 INFO nova.compute.manager [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Took 8.63 seconds to build instance.
Jan 22 17:40:33 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:40:33 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dfc0edd52efd16d6a903ccf50ee47f3d7e9329bf86af0d08e5db1286bca3448/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.230 182627 DEBUG oslo_concurrency.lockutils [None req-f7f9ac7d-b793-432b-a6d5-1dd11c4edb46 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:33 np0005592767 podman[230095]: 2026-01-22 22:40:33.232046874 +0000 UTC m=+0.165443339 container init fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:40:33 np0005592767 podman[230095]: 2026-01-22 22:40:33.238459795 +0000 UTC m=+0.171856250 container start fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:40:33 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [NOTICE]   (230114) : New worker (230116) forked
Jan 22 17:40:33 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [NOTICE]   (230114) : Loading success.
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.816 182627 DEBUG nova.network.neutron [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updated VIF entry in instance network info cache for port 4c730410-eed9-46ca-b9e0-5ba95117cece. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.818 182627 DEBUG nova.network.neutron [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updating instance_info_cache with network_info: [{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.833 182627 DEBUG oslo_concurrency.lockutils [req-f640749e-c859-4426-8b6f-800e4c84f311 req-0c1505dd-6189-4ded-b361-31249af5b79e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:40:33 np0005592767 nova_compute[182623]: 2026-01-22 22:40:33.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:40:34 np0005592767 nova_compute[182623]: 2026-01-22 22:40:34.859 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:34.859 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:40:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:34.862 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:40:35 np0005592767 nova_compute[182623]: 2026-01-22 22:40:35.124 182627 DEBUG nova.compute.manager [req-05ad13a3-01d7-4832-b5f2-a10a9c9d55d2 req-16b274b9-097d-4838-bae8-acc5b8afc883 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:35 np0005592767 nova_compute[182623]: 2026-01-22 22:40:35.125 182627 DEBUG oslo_concurrency.lockutils [req-05ad13a3-01d7-4832-b5f2-a10a9c9d55d2 req-16b274b9-097d-4838-bae8-acc5b8afc883 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:40:35 np0005592767 nova_compute[182623]: 2026-01-22 22:40:35.126 182627 DEBUG oslo_concurrency.lockutils [req-05ad13a3-01d7-4832-b5f2-a10a9c9d55d2 req-16b274b9-097d-4838-bae8-acc5b8afc883 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:40:35 np0005592767 nova_compute[182623]: 2026-01-22 22:40:35.127 182627 DEBUG oslo_concurrency.lockutils [req-05ad13a3-01d7-4832-b5f2-a10a9c9d55d2 req-16b274b9-097d-4838-bae8-acc5b8afc883 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:40:35 np0005592767 nova_compute[182623]: 2026-01-22 22:40:35.127 182627 DEBUG nova.compute.manager [req-05ad13a3-01d7-4832-b5f2-a10a9c9d55d2 req-16b274b9-097d-4838-bae8-acc5b8afc883 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] No waiting events found dispatching network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:40:35 np0005592767 nova_compute[182623]: 2026-01-22 22:40:35.128 182627 WARNING nova.compute.manager [req-05ad13a3-01d7-4832-b5f2-a10a9c9d55d2 req-16b274b9-097d-4838-bae8-acc5b8afc883 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received unexpected event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece for instance with vm_state active and task_state None.
Jan 22 17:40:36 np0005592767 nova_compute[182623]: 2026-01-22 22:40:36.486 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:36 np0005592767 nova_compute[182623]: 2026-01-22 22:40:36.880 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:38 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:38.865 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:40:39 np0005592767 podman[230125]: 2026-01-22 22:40:39.185030384 +0000 UTC m=+0.090578327 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:40:39 np0005592767 podman[230126]: 2026-01-22 22:40:39.194948604 +0000 UTC m=+0.095514676 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:40:39 np0005592767 nova_compute[182623]: 2026-01-22 22:40:39.680 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 np0005592767 NetworkManager[54973]: <info>  [1769121639.6854] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 22 17:40:39 np0005592767 NetworkManager[54973]: <info>  [1769121639.6875] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 22 17:40:39 np0005592767 nova_compute[182623]: 2026-01-22 22:40:39.864 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:39Z|00497|binding|INFO|Releasing lport 3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61 from this chassis (sb_readonly=0)
Jan 22 17:40:39 np0005592767 nova_compute[182623]: 2026-01-22 22:40:39.884 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:40 np0005592767 nova_compute[182623]: 2026-01-22 22:40:40.172 182627 DEBUG nova.compute.manager [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-changed-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:40:40 np0005592767 nova_compute[182623]: 2026-01-22 22:40:40.173 182627 DEBUG nova.compute.manager [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Refreshing instance network info cache due to event network-changed-4c730410-eed9-46ca-b9e0-5ba95117cece. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:40:40 np0005592767 nova_compute[182623]: 2026-01-22 22:40:40.173 182627 DEBUG oslo_concurrency.lockutils [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:40:40 np0005592767 nova_compute[182623]: 2026-01-22 22:40:40.174 182627 DEBUG oslo_concurrency.lockutils [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:40:40 np0005592767 nova_compute[182623]: 2026-01-22 22:40:40.174 182627 DEBUG nova.network.neutron [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Refreshing network info cache for port 4c730410-eed9-46ca-b9e0-5ba95117cece _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:40:41 np0005592767 nova_compute[182623]: 2026-01-22 22:40:41.488 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:41 np0005592767 nova_compute[182623]: 2026-01-22 22:40:41.935 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:42 np0005592767 nova_compute[182623]: 2026-01-22 22:40:42.305 182627 DEBUG nova.network.neutron [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updated VIF entry in instance network info cache for port 4c730410-eed9-46ca-b9e0-5ba95117cece. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:40:42 np0005592767 nova_compute[182623]: 2026-01-22 22:40:42.306 182627 DEBUG nova.network.neutron [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updating instance_info_cache with network_info: [{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:40:42 np0005592767 nova_compute[182623]: 2026-01-22 22:40:42.328 182627 DEBUG oslo_concurrency.lockutils [req-e664a8d6-f971-47e2-b465-f291b5495d92 req-3716a0ca-cfb5-4b12-b125-2daf4df00ad3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:40:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:46Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:ce:2c 10.100.0.10
Jan 22 17:40:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:46Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:ce:2c 10.100.0.10
Jan 22 17:40:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:46Z|00498|binding|INFO|Releasing lport 3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61 from this chassis (sb_readonly=0)
Jan 22 17:40:46 np0005592767 nova_compute[182623]: 2026-01-22 22:40:46.379 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:46 np0005592767 nova_compute[182623]: 2026-01-22 22:40:46.490 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:46 np0005592767 nova_compute[182623]: 2026-01-22 22:40:46.937 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:40:47 np0005592767 podman[230178]: 2026-01-22 22:40:47.163136854 +0000 UTC m=+0.068568046 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:40:51 np0005592767 nova_compute[182623]: 2026-01-22 22:40:51.494 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:51 np0005592767 nova_compute[182623]: 2026-01-22 22:40:51.964 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:52 np0005592767 nova_compute[182623]: 2026-01-22 22:40:52.020 182627 INFO nova.compute.manager [None req-e7b1d78a-d683-4e96-9291-86baf9d86040 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Get console output#033[00m
Jan 22 17:40:52 np0005592767 nova_compute[182623]: 2026-01-22 22:40:52.026 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:40:53 np0005592767 nova_compute[182623]: 2026-01-22 22:40:53.562 182627 INFO nova.compute.manager [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Rebuilding instance#033[00m
Jan 22 17:40:53 np0005592767 nova_compute[182623]: 2026-01-22 22:40:53.955 182627 DEBUG nova.compute.manager [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.038 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.054 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.067 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.079 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.089 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.093 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:40:54 np0005592767 nova_compute[182623]: 2026-01-22 22:40:54.967 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 kernel: tap4c730410-ee (unregistering): left promiscuous mode
Jan 22 17:40:56 np0005592767 NetworkManager[54973]: <info>  [1769121656.3458] device (tap4c730410-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:40:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:56Z|00499|binding|INFO|Releasing lport 4c730410-eed9-46ca-b9e0-5ba95117cece from this chassis (sb_readonly=0)
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.357 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:56Z|00500|binding|INFO|Setting lport 4c730410-eed9-46ca-b9e0-5ba95117cece down in Southbound
Jan 22 17:40:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:56Z|00501|binding|INFO|Removing iface tap4c730410-ee ovn-installed in OVS
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.360 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.366 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ce:2c 10.100.0.10'], port_security=['fa:16:3e:1e:ce:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3bede382-d3d0-4053-a98e-0add602d4f2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f0fd7437-d7e3-47f0-89a5-286033fb63ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f907ca0-d52f-4aef-b032-e68256e0ba9f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4c730410-eed9-46ca-b9e0-5ba95117cece) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.368 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4c730410-eed9-46ca-b9e0-5ba95117cece in datapath 09ec689a-3640-4bd8-88d1-a8c54c02874e unbound from our chassis#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.370 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09ec689a-3640-4bd8-88d1-a8c54c02874e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.371 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cef96086-1aaf-4b59-a49f-cfcc37464569]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.372 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e namespace which is not needed anymore#033[00m
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.386 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 22 17:40:56 np0005592767 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000086.scope: Consumed 13.432s CPU time.
Jan 22 17:40:56 np0005592767 systemd-machined[153912]: Machine qemu-65-instance-00000086 terminated.
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.496 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [NOTICE]   (230114) : haproxy version is 2.8.14-c23fe91
Jan 22 17:40:56 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [NOTICE]   (230114) : path to executable is /usr/sbin/haproxy
Jan 22 17:40:56 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [WARNING]  (230114) : Exiting Master process...
Jan 22 17:40:56 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [ALERT]    (230114) : Current worker (230116) exited with code 143 (Terminated)
Jan 22 17:40:56 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230110]: [WARNING]  (230114) : All workers exited. Exiting... (0)
Jan 22 17:40:56 np0005592767 systemd[1]: libpod-fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42.scope: Deactivated successfully.
Jan 22 17:40:56 np0005592767 podman[230228]: 2026-01-22 22:40:56.548564603 +0000 UTC m=+0.066771535 container died fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:40:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay-3dfc0edd52efd16d6a903ccf50ee47f3d7e9329bf86af0d08e5db1286bca3448-merged.mount: Deactivated successfully.
Jan 22 17:40:56 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42-userdata-shm.mount: Deactivated successfully.
Jan 22 17:40:56 np0005592767 podman[230228]: 2026-01-22 22:40:56.596170256 +0000 UTC m=+0.114377158 container cleanup fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.609 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 systemd[1]: libpod-conmon-fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42.scope: Deactivated successfully.
Jan 22 17:40:56 np0005592767 podman[230263]: 2026-01-22 22:40:56.701437927 +0000 UTC m=+0.064558343 container remove fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.710 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea4e24f-f42b-4c04-b319-ecf78cca73b6]: (4, ('Thu Jan 22 10:40:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e (fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42)\nfca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42\nThu Jan 22 10:40:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e (fca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42)\nfca34b75f264e56f3fccdb917dc0f9b645fbfe3a4cea9d6709bf05ec31ef9e42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.713 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e7511bff-6109-4110-bf8c-47a24b403553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.715 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09ec689a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.718 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 kernel: tap09ec689a-30: left promiscuous mode
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.747 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.751 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d7742991-10e6-4974-bc3f-f9608ae70bf4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.772 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[30d81489-ab61-468a-89ff-15622f89d85e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.774 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e60cda25-9c28-491b-ba64-d32535b06007]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.797 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[201e8908-ebaf-4c72-aeb0-432ee4c1494d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522908, 'reachable_time': 24041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230290, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 systemd[1]: run-netns-ovnmeta\x2d09ec689a\x2d3640\x2d4bd8\x2d88d1\x2da8c54c02874e.mount: Deactivated successfully.
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.803 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:40:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:56.804 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f613d9-4921-4aa8-a84e-dc385b676289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:56 np0005592767 nova_compute[182623]: 2026-01-22 22:40:56.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.125 182627 INFO nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance shutdown successfully after 3 seconds.#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.138 182627 INFO nova.virt.libvirt.driver [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance destroyed successfully.#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.144 182627 INFO nova.virt.libvirt.driver [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance destroyed successfully.#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.146 182627 DEBUG nova.virt.libvirt.vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:40:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-598973057',display_name='tempest-TestNetworkAdvancedServerOps-server-598973057',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-598973057',id=134,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODE6+SJJ8btY9W/mynN3qwbr5EVg7IaT7YrLyKKAYTxeOYe9MstU5bquIsyG2Q89VVqa7qu3wgDRxKifX86BN44B+A089z3VEkmm7pSVaOJes6RiePr56lhtjWcxilonw==',key_name='tempest-TestNetworkAdvancedServerOps-1883230350',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:40:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-cvjrm05k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:40:52Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=3bede382-d3d0-4053-a98e-0add602d4f2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.146 182627 DEBUG nova.network.os_vif_util [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.147 182627 DEBUG nova.network.os_vif_util [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.148 182627 DEBUG os_vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.151 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c730410-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.158 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.163 182627 INFO os_vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee')#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.163 182627 INFO nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Deleting instance files /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f_del#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.165 182627 INFO nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Deletion of /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f_del complete#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.595 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.596 182627 INFO nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Creating image(s)#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.596 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.597 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.598 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.611 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.689 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.690 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.691 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.705 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.766 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.767 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.807 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c,backing_fmt=raw /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.808 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.809 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.870 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.873 182627 DEBUG nova.virt.disk.api [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.873 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.970 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.972 182627 DEBUG nova.virt.disk.api [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.973 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.974 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Ensure instance console log exists: /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.975 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.976 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.977 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.984 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Start _get_guest_xml network_info=[{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:40:57 np0005592767 nova_compute[182623]: 2026-01-22 22:40:57.997 182627 WARNING nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.004 182627 DEBUG nova.virt.libvirt.host [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.005 182627 DEBUG nova.virt.libvirt.host [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.016 182627 DEBUG nova.virt.libvirt.host [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.020 182627 DEBUG nova.virt.libvirt.host [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.022 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.022 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:41Z,direct_url=<?>,disk_format='qcow2',id=8bcaf91e-26cd-4687-9abd-8185bd0c5241,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.023 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.023 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.023 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.023 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.023 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.024 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.024 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.024 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.024 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.025 182627 DEBUG nova.virt.hardware [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.025 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.049 182627 DEBUG nova.virt.libvirt.vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:40:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-598973057',display_name='tempest-TestNetworkAdvancedServerOps-server-598973057',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-598973057',id=134,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODE6+SJJ8btY9W/mynN3qwbr5EVg7IaT7YrLyKKAYTxeOYe9MstU5bquIsyG2Q89VVqa7qu3wgDRxKifX86BN44B+A089z3VEkmm7pSVaOJes6RiePr56lhtjWcxilonw==',key_name='tempest-TestNetworkAdvancedServerOps-1883230350',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:40:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-cvjrm05k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:40:57Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=3bede382-d3d0-4053-a98e-0add602d4f2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.050 182627 DEBUG nova.network.os_vif_util [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.052 182627 DEBUG nova.network.os_vif_util [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.055 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <uuid>3bede382-d3d0-4053-a98e-0add602d4f2f</uuid>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <name>instance-00000086</name>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-598973057</nova:name>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:40:58</nova:creationTime>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="8bcaf91e-26cd-4687-9abd-8185bd0c5241"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        <nova:port uuid="4c730410-eed9-46ca-b9e0-5ba95117cece">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <entry name="serial">3bede382-d3d0-4053-a98e-0add602d4f2f</entry>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <entry name="uuid">3bede382-d3d0-4053-a98e-0add602d4f2f</entry>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:1e:ce:2c"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <target dev="tap4c730410-ee"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/console.log" append="off"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:40:58 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:40:58 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:40:58 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:40:58 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.057 182627 DEBUG nova.virt.libvirt.vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:40:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-598973057',display_name='tempest-TestNetworkAdvancedServerOps-server-598973057',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-598973057',id=134,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODE6+SJJ8btY9W/mynN3qwbr5EVg7IaT7YrLyKKAYTxeOYe9MstU5bquIsyG2Q89VVqa7qu3wgDRxKifX86BN44B+A089z3VEkmm7pSVaOJes6RiePr56lhtjWcxilonw==',key_name='tempest-TestNetworkAdvancedServerOps-1883230350',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:40:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-cvjrm05k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:40:57Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=3bede382-d3d0-4053-a98e-0add602d4f2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.057 182627 DEBUG nova.network.os_vif_util [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.058 182627 DEBUG nova.network.os_vif_util [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.058 182627 DEBUG os_vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.059 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.060 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.060 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.064 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c730410-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.065 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c730410-ee, col_values=(('external_ids', {'iface-id': '4c730410-eed9-46ca-b9e0-5ba95117cece', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:ce:2c', 'vm-uuid': '3bede382-d3d0-4053-a98e-0add602d4f2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.066 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:58 np0005592767 NetworkManager[54973]: <info>  [1769121658.0684] manager: (tap4c730410-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.074 182627 INFO os_vif [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee')#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.144 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.145 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.145 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:1e:ce:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.147 182627 INFO nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Using config drive#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.162 182627 DEBUG nova.compute.manager [req-9f1c8dcd-d721-43eb-b3f9-5c2c27cc995b req-3ad19384-8473-40b3-94aa-33650e3ea54c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-unplugged-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.162 182627 DEBUG oslo_concurrency.lockutils [req-9f1c8dcd-d721-43eb-b3f9-5c2c27cc995b req-3ad19384-8473-40b3-94aa-33650e3ea54c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.163 182627 DEBUG oslo_concurrency.lockutils [req-9f1c8dcd-d721-43eb-b3f9-5c2c27cc995b req-3ad19384-8473-40b3-94aa-33650e3ea54c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.163 182627 DEBUG oslo_concurrency.lockutils [req-9f1c8dcd-d721-43eb-b3f9-5c2c27cc995b req-3ad19384-8473-40b3-94aa-33650e3ea54c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.163 182627 DEBUG nova.compute.manager [req-9f1c8dcd-d721-43eb-b3f9-5c2c27cc995b req-3ad19384-8473-40b3-94aa-33650e3ea54c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] No waiting events found dispatching network-vif-unplugged-4c730410-eed9-46ca-b9e0-5ba95117cece pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.164 182627 WARNING nova.compute.manager [req-9f1c8dcd-d721-43eb-b3f9-5c2c27cc995b req-3ad19384-8473-40b3-94aa-33650e3ea54c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received unexpected event network-vif-unplugged-4c730410-eed9-46ca-b9e0-5ba95117cece for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.165 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:58 np0005592767 podman[230306]: 2026-01-22 22:40:58.171139145 +0000 UTC m=+0.084838145 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.201 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'keypairs' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.522 182627 INFO nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Creating config drive at /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.531 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprnya8nrj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.683 182627 DEBUG oslo_concurrency.processutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprnya8nrj" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:40:58 np0005592767 kernel: tap4c730410-ee: entered promiscuous mode
Jan 22 17:40:58 np0005592767 systemd-udevd[230206]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:40:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:58Z|00502|binding|INFO|Claiming lport 4c730410-eed9-46ca-b9e0-5ba95117cece for this chassis.
Jan 22 17:40:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:58Z|00503|binding|INFO|4c730410-eed9-46ca-b9e0-5ba95117cece: Claiming fa:16:3e:1e:ce:2c 10.100.0.10
Jan 22 17:40:58 np0005592767 NetworkManager[54973]: <info>  [1769121658.7993] manager: (tap4c730410-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.799 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.809 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ce:2c 10.100.0.10'], port_security=['fa:16:3e:1e:ce:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3bede382-d3d0-4053-a98e-0add602d4f2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f0fd7437-d7e3-47f0-89a5-286033fb63ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f907ca0-d52f-4aef-b032-e68256e0ba9f, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4c730410-eed9-46ca-b9e0-5ba95117cece) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.811 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4c730410-eed9-46ca-b9e0-5ba95117cece in datapath 09ec689a-3640-4bd8-88d1-a8c54c02874e bound to our chassis#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.814 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09ec689a-3640-4bd8-88d1-a8c54c02874e#033[00m
Jan 22 17:40:58 np0005592767 NetworkManager[54973]: <info>  [1769121658.8205] device (tap4c730410-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:40:58 np0005592767 NetworkManager[54973]: <info>  [1769121658.8228] device (tap4c730410-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:40:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:58Z|00504|binding|INFO|Setting lport 4c730410-eed9-46ca-b9e0-5ba95117cece ovn-installed in OVS
Jan 22 17:40:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:58Z|00505|binding|INFO|Setting lport 4c730410-eed9-46ca-b9e0-5ba95117cece up in Southbound
Jan 22 17:40:58 np0005592767 nova_compute[182623]: 2026-01-22 22:40:58.829 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.838 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ce445677-6249-44ab-8bdd-7d1a46ec4bef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.839 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09ec689a-31 in ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.844 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09ec689a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.844 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[249b325d-f2d5-4dc5-96cc-4b03c628393e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.845 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[64f6dc9b-9291-4957-8951-4f0f688ba65c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 systemd-machined[153912]: New machine qemu-66-instance-00000086.
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.866 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5c3534-1777-4437-b4df-58171907113e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 systemd[1]: Started Virtual Machine qemu-66-instance-00000086.
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.901 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9be86d-3f4b-4e8b-a90d-d9ee6c615330]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.950 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[52bf09e6-a0be-4be1-961d-eb31187da943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:58.957 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3a72643b-0c55-4ce4-a90d-af7cebf931ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:58 np0005592767 NetworkManager[54973]: <info>  [1769121658.9594] manager: (tap09ec689a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.012 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0d9a0d-d19a-4bc1-8a5a-dfd0a6106ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.019 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[25472181-02d3-4f0b-9683-f73b03ed772e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 NetworkManager[54973]: <info>  [1769121659.0518] device (tap09ec689a-30): carrier: link connected
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.060 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[cd31bb00-20a1-4803-926e-200ce795d64b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.088 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f64d10c9-e623-474c-9e7a-e232bb4a0644]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09ec689a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:4b:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 153], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525562, 'reachable_time': 28215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230378, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.116 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba4b4d7-a55f-445b-bb62-b576b0806c43]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:4bf7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525562, 'tstamp': 525562}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230379, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.138 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[34a9a024-b162-4d1c-a3a5-a92f9ae36077]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09ec689a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:4b:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 153], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525562, 'reachable_time': 28215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230380, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.187 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[156ce0b0-bbaa-4bb6-b50e-b3d93853b7de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.292 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[97926ae3-d9d2-4336-b2a1-8bc0e1a00857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.294 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09ec689a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.295 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.296 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09ec689a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.298 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:59 np0005592767 NetworkManager[54973]: <info>  [1769121659.2996] manager: (tap09ec689a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 22 17:40:59 np0005592767 kernel: tap09ec689a-30: entered promiscuous mode
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.304 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.305 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09ec689a-30, col_values=(('external_ids', {'iface-id': '3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.307 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:40:59Z|00506|binding|INFO|Releasing lport 3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61 from this chassis (sb_readonly=0)
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.333 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.334 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09ec689a-3640-4bd8-88d1-a8c54c02874e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09ec689a-3640-4bd8-88d1-a8c54c02874e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.336 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3da04f-d687-4523-a377-72e6656581fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.337 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-09ec689a-3640-4bd8-88d1-a8c54c02874e
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/09ec689a-3640-4bd8-88d1-a8c54c02874e.pid.haproxy
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 09ec689a-3640-4bd8-88d1-a8c54c02874e
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:40:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:40:59.337 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'env', 'PROCESS_TAG=haproxy-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09ec689a-3640-4bd8-88d1-a8c54c02874e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.360 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 3bede382-d3d0-4053-a98e-0add602d4f2f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.361 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121659.3600717, 3bede382-d3d0-4053-a98e-0add602d4f2f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.362 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.367 182627 DEBUG nova.compute.manager [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.368 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.376 182627 INFO nova.virt.libvirt.driver [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance spawned successfully.#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.376 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.399 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.411 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.418 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.419 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.420 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.421 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.421 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.422 182627 DEBUG nova.virt.libvirt.driver [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.440 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.440 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121659.3619845, 3bede382-d3d0-4053-a98e-0add602d4f2f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.441 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] VM Started (Lifecycle Event)#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.472 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.478 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.505 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.527 182627 DEBUG nova.compute.manager [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.642 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.643 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.643 182627 DEBUG nova.objects.instance [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 22 17:40:59 np0005592767 nova_compute[182623]: 2026-01-22 22:40:59.747 182627 DEBUG oslo_concurrency.lockutils [None req-9666d79b-fcbe-418b-808f-56465ae34195 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:40:59 np0005592767 podman[230418]: 2026-01-22 22:40:59.794103109 +0000 UTC m=+0.065657743 container create 7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:40:59 np0005592767 systemd[1]: Started libpod-conmon-7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5.scope.
Jan 22 17:40:59 np0005592767 podman[230418]: 2026-01-22 22:40:59.761499149 +0000 UTC m=+0.033053823 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:40:59 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:40:59 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5762d684410ab05fca366b664a4e3a17f730d5b0dc4e0028e1335fbecaa7fe75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:40:59 np0005592767 podman[230418]: 2026-01-22 22:40:59.89477889 +0000 UTC m=+0.166333564 container init 7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:40:59 np0005592767 podman[230418]: 2026-01-22 22:40:59.903845756 +0000 UTC m=+0.175400400 container start 7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:40:59 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [NOTICE]   (230437) : New worker (230439) forked
Jan 22 17:40:59 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [NOTICE]   (230437) : Loading success.
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.271 182627 DEBUG nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.272 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.273 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.273 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.274 182627 DEBUG nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] No waiting events found dispatching network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.274 182627 WARNING nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received unexpected event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece for instance with vm_state active and task_state None.#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.274 182627 DEBUG nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.275 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.275 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.275 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.276 182627 DEBUG nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] No waiting events found dispatching network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.276 182627 WARNING nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received unexpected event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece for instance with vm_state active and task_state None.#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.277 182627 DEBUG nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.277 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.278 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.278 182627 DEBUG oslo_concurrency.lockutils [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.278 182627 DEBUG nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] No waiting events found dispatching network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.280 182627 WARNING nova.compute.manager [req-043c24fb-2939-49d4-8b75-259cd9e24c2a req-7e02a615-f6d8-4070-a04c-6ef8c826ac28 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received unexpected event network-vif-plugged-4c730410-eed9-46ca-b9e0-5ba95117cece for instance with vm_state active and task_state None.#033[00m
Jan 22 17:41:00 np0005592767 nova_compute[182623]: 2026-01-22 22:41:00.284 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:01 np0005592767 nova_compute[182623]: 2026-01-22 22:41:01.971 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:03 np0005592767 nova_compute[182623]: 2026-01-22 22:41:03.067 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:03 np0005592767 podman[230449]: 2026-01-22 22:41:03.182398973 +0000 UTC m=+0.091242155 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, 
build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:41:03 np0005592767 podman[230448]: 2026-01-22 22:41:03.198465466 +0000 UTC m=+0.109257383 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:41:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:06Z|00507|binding|INFO|Releasing lport 3fd80f7c-0e7e-4f3b-a6a6-5fbdf4667c61 from this chassis (sb_readonly=0)
Jan 22 17:41:06 np0005592767 nova_compute[182623]: 2026-01-22 22:41:06.128 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:06 np0005592767 nova_compute[182623]: 2026-01-22 22:41:06.974 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:08 np0005592767 nova_compute[182623]: 2026-01-22 22:41:08.070 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:10 np0005592767 podman[230494]: 2026-01-22 22:41:10.134256327 +0000 UTC m=+0.051237407 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:41:10 np0005592767 podman[230493]: 2026-01-22 22:41:10.13438177 +0000 UTC m=+0.052924434 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 17:41:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:11Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:ce:2c 10.100.0.10
Jan 22 17:41:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:11Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:ce:2c 10.100.0.10
Jan 22 17:41:11 np0005592767 nova_compute[182623]: 2026-01-22 22:41:11.977 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:12.113 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:12.114 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:12.115 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:13 np0005592767 nova_compute[182623]: 2026-01-22 22:41:13.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:13 np0005592767 nova_compute[182623]: 2026-01-22 22:41:13.283 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:16 np0005592767 nova_compute[182623]: 2026-01-22 22:41:16.979 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:17 np0005592767 nova_compute[182623]: 2026-01-22 22:41:17.933 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:18 np0005592767 nova_compute[182623]: 2026-01-22 22:41:18.075 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:18 np0005592767 podman[230548]: 2026-01-22 22:41:18.136087156 +0000 UTC m=+0.052930784 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:41:18 np0005592767 nova_compute[182623]: 2026-01-22 22:41:18.255 182627 INFO nova.compute.manager [None req-e046699e-e782-4849-be79-028c352d9ba7 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Get console output#033[00m
Jan 22 17:41:18 np0005592767 nova_compute[182623]: 2026-01-22 22:41:18.261 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.300 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.301 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.302 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.302 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.302 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.319 182627 INFO nova.compute.manager [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Terminating instance#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.332 182627 DEBUG nova.compute.manager [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:41:19 np0005592767 kernel: tap4c730410-ee (unregistering): left promiscuous mode
Jan 22 17:41:19 np0005592767 NetworkManager[54973]: <info>  [1769121679.3637] device (tap4c730410-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:41:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:19Z|00508|binding|INFO|Releasing lport 4c730410-eed9-46ca-b9e0-5ba95117cece from this chassis (sb_readonly=0)
Jan 22 17:41:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:19Z|00509|binding|INFO|Setting lport 4c730410-eed9-46ca-b9e0-5ba95117cece down in Southbound
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.376 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:19Z|00510|binding|INFO|Removing iface tap4c730410-ee ovn-installed in OVS
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.379 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.392 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ce:2c 10.100.0.10'], port_security=['fa:16:3e:1e:ce:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3bede382-d3d0-4053-a98e-0add602d4f2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f0fd7437-d7e3-47f0-89a5-286033fb63ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f907ca0-d52f-4aef-b032-e68256e0ba9f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4c730410-eed9-46ca-b9e0-5ba95117cece) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.393 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4c730410-eed9-46ca-b9e0-5ba95117cece in datapath 09ec689a-3640-4bd8-88d1-a8c54c02874e unbound from our chassis#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.396 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09ec689a-3640-4bd8-88d1-a8c54c02874e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.396 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.398 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a8429b7d-9b0b-40d5-b872-8fe1a6d5b13a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.400 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e namespace which is not needed anymore#033[00m
Jan 22 17:41:19 np0005592767 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 22 17:41:19 np0005592767 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000086.scope: Consumed 13.307s CPU time.
Jan 22 17:41:19 np0005592767 systemd-machined[153912]: Machine qemu-66-instance-00000086 terminated.
Jan 22 17:41:19 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [NOTICE]   (230437) : haproxy version is 2.8.14-c23fe91
Jan 22 17:41:19 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [NOTICE]   (230437) : path to executable is /usr/sbin/haproxy
Jan 22 17:41:19 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [WARNING]  (230437) : Exiting Master process...
Jan 22 17:41:19 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [WARNING]  (230437) : Exiting Master process...
Jan 22 17:41:19 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [ALERT]    (230437) : Current worker (230439) exited with code 143 (Terminated)
Jan 22 17:41:19 np0005592767 neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e[230433]: [WARNING]  (230437) : All workers exited. Exiting... (0)
Jan 22 17:41:19 np0005592767 systemd[1]: libpod-7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5.scope: Deactivated successfully.
Jan 22 17:41:19 np0005592767 podman[230597]: 2026-01-22 22:41:19.545799222 +0000 UTC m=+0.048178961 container died 7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.552 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.559 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5-userdata-shm.mount: Deactivated successfully.
Jan 22 17:41:19 np0005592767 systemd[1]: var-lib-containers-storage-overlay-5762d684410ab05fca366b664a4e3a17f730d5b0dc4e0028e1335fbecaa7fe75-merged.mount: Deactivated successfully.
Jan 22 17:41:19 np0005592767 podman[230597]: 2026-01-22 22:41:19.595478553 +0000 UTC m=+0.097858282 container cleanup 7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.605 182627 INFO nova.virt.libvirt.driver [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Instance destroyed successfully.#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.607 182627 DEBUG nova.objects.instance [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid 3bede382-d3d0-4053-a98e-0add602d4f2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:19 np0005592767 systemd[1]: libpod-conmon-7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5.scope: Deactivated successfully.
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.635 182627 DEBUG nova.virt.libvirt.vif [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:40:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-598973057',display_name='tempest-TestNetworkAdvancedServerOps-server-598973057',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-598973057',id=134,image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODE6+SJJ8btY9W/mynN3qwbr5EVg7IaT7YrLyKKAYTxeOYe9MstU5bquIsyG2Q89VVqa7qu3wgDRxKifX86BN44B+A089z3VEkmm7pSVaOJes6RiePr56lhtjWcxilonw==',key_name='tempest-TestNetworkAdvancedServerOps-1883230350',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:40:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-cvjrm05k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='8bcaf91e-26cd-4687-9abd-8185bd0c5241',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:40:59Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=3bede382-d3d0-4053-a98e-0add602d4f2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.635 182627 DEBUG nova.network.os_vif_util [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.636 182627 DEBUG nova.network.os_vif_util [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.637 182627 DEBUG os_vif [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.639 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.639 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c730410-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.640 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.645 182627 INFO os_vif [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:ce:2c,bridge_name='br-int',has_traffic_filtering=True,id=4c730410-eed9-46ca-b9e0-5ba95117cece,network=Network(09ec689a-3640-4bd8-88d1-a8c54c02874e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c730410-ee')#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.645 182627 INFO nova.virt.libvirt.driver [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Deleting instance files /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f_del#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.646 182627 INFO nova.virt.libvirt.driver [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Deletion of /var/lib/nova/instances/3bede382-d3d0-4053-a98e-0add602d4f2f_del complete#033[00m
Jan 22 17:41:19 np0005592767 podman[230640]: 2026-01-22 22:41:19.6655363 +0000 UTC m=+0.045155205 container remove 7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.671 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[59b15200-1ace-4fa0-9344-976c1e3c475c]: (4, ('Thu Jan 22 10:41:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e (7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5)\n7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5\nThu Jan 22 10:41:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e (7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5)\n7c29b810cde2dcbbbc937d0483dcfb78e9c87370b5166360b508e9eb33e479e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.673 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[630d627a-33e7-44b8-b9f7-4025c2ddb90a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.675 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09ec689a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.678 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 kernel: tap09ec689a-30: left promiscuous mode
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.686 182627 DEBUG nova.compute.manager [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-changed-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.686 182627 DEBUG nova.compute.manager [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Refreshing instance network info cache due to event network-changed-4c730410-eed9-46ca-b9e0-5ba95117cece. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.687 182627 DEBUG oslo_concurrency.lockutils [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.687 182627 DEBUG oslo_concurrency.lockutils [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.687 182627 DEBUG nova.network.neutron [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Refreshing network info cache for port 4c730410-eed9-46ca-b9e0-5ba95117cece _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.702 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9d68dc-240e-4132-952d-5768f639242c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.722 182627 INFO nova.compute.manager [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.723 182627 DEBUG oslo.service.loopingcall [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.723 182627 DEBUG nova.compute.manager [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:41:19 np0005592767 nova_compute[182623]: 2026-01-22 22:41:19.723 182627 DEBUG nova.network.neutron [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.730 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9b0122ad-ac64-417a-a150-a7547c8a7c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.732 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[62258436-e393-47e0-86e4-c9d675329316]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.752 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d81450-ce41-4a6c-b3fc-dfc4d0398fcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525551, 'reachable_time': 29956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230654, 'error': None, 'target': 'ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:19 np0005592767 systemd[1]: run-netns-ovnmeta\x2d09ec689a\x2d3640\x2d4bd8\x2d88d1\x2da8c54c02874e.mount: Deactivated successfully.
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.758 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09ec689a-3640-4bd8-88d1-a8c54c02874e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:41:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:19.759 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[bfacff59-e97d-4dda-83fe-a475b73c33de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.400 182627 DEBUG nova.network.neutron [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.419 182627 INFO nova.compute.manager [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Took 0.70 seconds to deallocate network for instance.#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.499 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.499 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.514 182627 DEBUG nova.compute.manager [req-83bb7725-c0a8-4fb5-bd21-b305a23ceacb req-da0585da-7ea5-47ec-8bb3-5125a27b2d7e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Received event network-vif-deleted-4c730410-eed9-46ca-b9e0-5ba95117cece external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.530 182627 DEBUG nova.scheduler.client.report [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.549 182627 DEBUG nova.scheduler.client.report [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.550 182627 DEBUG nova.compute.provider_tree [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.567 182627 DEBUG nova.scheduler.client.report [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.601 182627 DEBUG nova.scheduler.client.report [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.642 182627 DEBUG nova.compute.provider_tree [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.663 182627 DEBUG nova.scheduler.client.report [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.689 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.715 182627 INFO nova.scheduler.client.report [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocations for instance 3bede382-d3d0-4053-a98e-0add602d4f2f#033[00m
Jan 22 17:41:20 np0005592767 nova_compute[182623]: 2026-01-22 22:41:20.851 182627 DEBUG oslo_concurrency.lockutils [None req-583d6da4-4f9f-47fa-af87-9b32dc6f234c 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "3bede382-d3d0-4053-a98e-0add602d4f2f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.023 182627 DEBUG nova.network.neutron [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updated VIF entry in instance network info cache for port 4c730410-eed9-46ca-b9e0-5ba95117cece. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.024 182627 DEBUG nova.network.neutron [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Updating instance_info_cache with network_info: [{"id": "4c730410-eed9-46ca-b9e0-5ba95117cece", "address": "fa:16:3e:1e:ce:2c", "network": {"id": "09ec689a-3640-4bd8-88d1-a8c54c02874e", "bridge": "br-int", "label": "tempest-network-smoke--109516624", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c730410-ee", "ovs_interfaceid": "4c730410-eed9-46ca-b9e0-5ba95117cece", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.043 182627 DEBUG oslo_concurrency.lockutils [req-9bfc5c66-32bd-4eca-9bf1-1b8310e590ae req-e9efe5e1-4c7f-4405-894f-fd7011f9bfce 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-3bede382-d3d0-4053-a98e-0add602d4f2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.331 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.332 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.349 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.437 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.437 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.446 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.447 182627 INFO nova.compute.claims [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.578 182627 DEBUG nova.compute.provider_tree [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.593 182627 DEBUG nova.scheduler.client.report [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.621 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.622 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.673 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.673 182627 DEBUG nova.network.neutron [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.688 182627 INFO nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.713 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.873 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.876 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.876 182627 INFO nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Creating image(s)#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.878 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "/var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.878 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "/var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.881 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "/var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.906 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.975 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.977 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:21 np0005592767 nova_compute[182623]: 2026-01-22 22:41:21.978 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.002 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.031 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.079 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.080 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.123 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.125 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.126 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.189 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.192 182627 DEBUG nova.virt.disk.api [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Checking if we can resize image /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.193 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.295 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.297 182627 DEBUG nova.virt.disk.api [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Cannot resize image /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.297 182627 DEBUG nova.objects.instance [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lazy-loading 'migration_context' on Instance uuid c93b2196-c404-45e0-93af-3dc1e7f48c5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.311 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.312 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Ensure instance console log exists: /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.313 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.313 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.314 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:22 np0005592767 nova_compute[182623]: 2026-01-22 22:41:22.628 182627 DEBUG nova.policy [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fc0ad46b71ee49809728db8b218a1bbd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e0007cc6ba2c4c2abd654f4226e30456', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:41:23 np0005592767 nova_compute[182623]: 2026-01-22 22:41:23.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:24 np0005592767 nova_compute[182623]: 2026-01-22 22:41:24.642 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:24 np0005592767 nova_compute[182623]: 2026-01-22 22:41:24.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:24 np0005592767 nova_compute[182623]: 2026-01-22 22:41:24.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:41:24 np0005592767 nova_compute[182623]: 2026-01-22 22:41:24.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:41:24 np0005592767 nova_compute[182623]: 2026-01-22 22:41:24.940 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:41:24 np0005592767 nova_compute[182623]: 2026-01-22 22:41:24.941 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:41:25 np0005592767 nova_compute[182623]: 2026-01-22 22:41:25.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:25 np0005592767 nova_compute[182623]: 2026-01-22 22:41:25.999 182627 DEBUG nova.network.neutron [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Successfully created port: d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:41:26 np0005592767 nova_compute[182623]: 2026-01-22 22:41:26.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:26 np0005592767 nova_compute[182623]: 2026-01-22 22:41:26.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:26 np0005592767 nova_compute[182623]: 2026-01-22 22:41:26.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:26 np0005592767 nova_compute[182623]: 2026-01-22 22:41:26.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:26 np0005592767 nova_compute[182623]: 2026-01-22 22:41:26.917 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:41:26 np0005592767 nova_compute[182623]: 2026-01-22 22:41:26.982 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.098 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.099 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5647MB free_disk=73.18858337402344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.099 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.099 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.180 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance c93b2196-c404-45e0-93af-3dc1e7f48c5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.180 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.181 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.235 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.265 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.300 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.301 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.534 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:27 np0005592767 nova_compute[182623]: 2026-01-22 22:41:27.753 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.301 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.302 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.562 182627 DEBUG nova.network.neutron [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Successfully updated port: d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.583 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "refresh_cache-c93b2196-c404-45e0-93af-3dc1e7f48c5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.583 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquired lock "refresh_cache-c93b2196-c404-45e0-93af-3dc1e7f48c5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.584 182627 DEBUG nova.network.neutron [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:41:28 np0005592767 nova_compute[182623]: 2026-01-22 22:41:28.808 182627 DEBUG nova.network.neutron [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:41:29 np0005592767 podman[230673]: 2026-01-22 22:41:29.174716293 +0000 UTC m=+0.082138789 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.637 182627 DEBUG nova.network.neutron [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Updating instance_info_cache with network_info: [{"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.644 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.667 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Releasing lock "refresh_cache-c93b2196-c404-45e0-93af-3dc1e7f48c5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.668 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Instance network_info: |[{"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.672 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Start _get_guest_xml network_info=[{"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.680 182627 WARNING nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.684 182627 DEBUG nova.virt.libvirt.host [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.685 182627 DEBUG nova.virt.libvirt.host [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.689 182627 DEBUG nova.virt.libvirt.host [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.690 182627 DEBUG nova.virt.libvirt.host [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.692 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.693 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.694 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.694 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.695 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.695 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.696 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.696 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.697 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.697 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.698 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.698 182627 DEBUG nova.virt.hardware [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.705 182627 DEBUG nova.virt.libvirt.vif [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-699455060',display_name='tempest-ServerAddressesNegativeTestJSON-server-699455060',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-699455060',id=136,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e0007cc6ba2c4c2abd654f4226e30456',ramdisk_id='',reservation_id='r-ul5caoao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1506497326',owner_
user_name='tempest-ServerAddressesNegativeTestJSON-1506497326-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:21Z,user_data=None,user_id='fc0ad46b71ee49809728db8b218a1bbd',uuid=c93b2196-c404-45e0-93af-3dc1e7f48c5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.706 182627 DEBUG nova.network.os_vif_util [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Converting VIF {"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.707 182627 DEBUG nova.network.os_vif_util [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.709 182627 DEBUG nova.objects.instance [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lazy-loading 'pci_devices' on Instance uuid c93b2196-c404-45e0-93af-3dc1e7f48c5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.733 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <uuid>c93b2196-c404-45e0-93af-3dc1e7f48c5f</uuid>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <name>instance-00000088</name>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-699455060</nova:name>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:41:29</nova:creationTime>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:user uuid="fc0ad46b71ee49809728db8b218a1bbd">tempest-ServerAddressesNegativeTestJSON-1506497326-project-member</nova:user>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:project uuid="e0007cc6ba2c4c2abd654f4226e30456">tempest-ServerAddressesNegativeTestJSON-1506497326</nova:project>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        <nova:port uuid="d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <entry name="serial">c93b2196-c404-45e0-93af-3dc1e7f48c5f</entry>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <entry name="uuid">c93b2196-c404-45e0-93af-3dc1e7f48c5f</entry>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.config"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:fc:c5:e0"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <target dev="tapd7e4ecf2-4b"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/console.log" append="off"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:41:29 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:41:29 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:41:29 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:41:29 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.735 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Preparing to wait for external event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.736 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.736 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.736 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.737 182627 DEBUG nova.virt.libvirt.vif [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-699455060',display_name='tempest-ServerAddressesNegativeTestJSON-server-699455060',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-699455060',id=136,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e0007cc6ba2c4c2abd654f4226e30456',ramdisk_id='',reservation_id='r-ul5caoao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1506497326',owner_user_name='tempest-ServerAddressesNegativeTestJSON-1506497326-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:21Z,user_data=None,user_id='fc0ad46b71ee49809728db8b218a1bbd',uuid=c93b2196-c404-45e0-93af-3dc1e7f48c5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.737 182627 DEBUG nova.network.os_vif_util [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Converting VIF {"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.737 182627 DEBUG nova.network.os_vif_util [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.738 182627 DEBUG os_vif [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.738 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.738 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.739 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.741 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.741 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7e4ecf2-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.742 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7e4ecf2-4b, col_values=(('external_ids', {'iface-id': 'd7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:c5:e0', 'vm-uuid': 'c93b2196-c404-45e0-93af-3dc1e7f48c5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.780 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:29 np0005592767 NetworkManager[54973]: <info>  [1769121689.7815] manager: (tapd7e4ecf2-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.789 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.790 182627 INFO os_vif [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b')#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.862 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.863 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.863 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] No VIF found with MAC fa:16:3e:fc:c5:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.864 182627 INFO nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Using config drive#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:29 np0005592767 nova_compute[182623]: 2026-01-22 22:41:29.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.113 182627 DEBUG nova.compute.manager [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-changed-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.113 182627 DEBUG nova.compute.manager [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Refreshing instance network info cache due to event network-changed-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.114 182627 DEBUG oslo_concurrency.lockutils [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c93b2196-c404-45e0-93af-3dc1e7f48c5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.114 182627 DEBUG oslo_concurrency.lockutils [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c93b2196-c404-45e0-93af-3dc1e7f48c5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.114 182627 DEBUG nova.network.neutron [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Refreshing network info cache for port d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.925 182627 INFO nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Creating config drive at /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.config#033[00m
Jan 22 17:41:30 np0005592767 nova_compute[182623]: 2026-01-22 22:41:30.934 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqnkbd7my execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.070 182627 DEBUG oslo_concurrency.processutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqnkbd7my" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:31 np0005592767 kernel: tapd7e4ecf2-4b: entered promiscuous mode
Jan 22 17:41:31 np0005592767 NetworkManager[54973]: <info>  [1769121691.1382] manager: (tapd7e4ecf2-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 22 17:41:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:31Z|00511|binding|INFO|Claiming lport d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe for this chassis.
Jan 22 17:41:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:31Z|00512|binding|INFO|d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe: Claiming fa:16:3e:fc:c5:e0 10.100.0.10
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 systemd-udevd[230713]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.171 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:c5:e0 10.100.0.10'], port_security=['fa:16:3e:fc:c5:e0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c93b2196-c404-45e0-93af-3dc1e7f48c5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0007cc6ba2c4c2abd654f4226e30456', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24362a1d-6f65-40a7-9f9f-3ea85e430d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=16d15a91-8dbb-4714-a558-6d332aa817b2, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.173 104135 INFO neutron.agent.ovn.metadata.agent [-] Port d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe in datapath 544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 bound to our chassis#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.176 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 544805a8-5e0b-4bae-8cfd-7b8c88fe58a2#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.194 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9dddd7-e79b-4870-bfc2-366b1e9726c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.195 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap544805a8-51 in ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:41:31 np0005592767 systemd-machined[153912]: New machine qemu-67-instance-00000088.
Jan 22 17:41:31 np0005592767 NetworkManager[54973]: <info>  [1769121691.1972] device (tapd7e4ecf2-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:41:31 np0005592767 NetworkManager[54973]: <info>  [1769121691.1980] device (tapd7e4ecf2-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.201 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap544805a8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.201 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[193087bb-3b0b-4cff-9f4e-1c6234ca4dc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.202 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[77dd75ad-5132-4e30-bf62-859721fc423a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 systemd[1]: Started Virtual Machine qemu-67-instance-00000088.
Jan 22 17:41:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:31Z|00513|binding|INFO|Setting lport d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe ovn-installed in OVS
Jan 22 17:41:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:31Z|00514|binding|INFO|Setting lport d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe up in Southbound
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.216 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.223 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[cec9eb99-618c-47f2-95ae-2e7557fdb962]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.240 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a234bc2-b7f5-4a1f-93e3-f64ce42443e0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.276 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fed715e8-bc7f-4666-ac89-47bb5d35887a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.286 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[805d8a39-13c6-40e9-80d6-64ec897b88c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 NetworkManager[54973]: <info>  [1769121691.2879] manager: (tap544805a8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.328 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6b7fdd-bff0-4b2f-a8db-c50b670fbd43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.330 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7a87e6-3d80-464c-9037-6e4236dc597d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 NetworkManager[54973]: <info>  [1769121691.3579] device (tap544805a8-50): carrier: link connected
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.364 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3307ee40-90bf-4c13-af44-ab634965b098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.386 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d6afb9-326c-4793-ba04-a5db606d9d32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap544805a8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:d4:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528793, 'reachable_time': 40431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230747, 'error': None, 'target': 'ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.409 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5c1f15-609f-4b7f-840a-5c9277f8e4a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:d4fa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528793, 'tstamp': 528793}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230748, 'error': None, 'target': 'ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.431 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[22de511b-f9d8-43ca-9598-05142cb981c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap544805a8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:d4:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528793, 'reachable_time': 40431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230749, 'error': None, 'target': 'ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.482 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5138b6-fd00-4581-ac02-e92f97d7d5f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.581 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5e0b6131-d1e6-4b75-8c87-0353879b7ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.583 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap544805a8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.584 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.585 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap544805a8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.587 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 NetworkManager[54973]: <info>  [1769121691.5886] manager: (tap544805a8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 22 17:41:31 np0005592767 kernel: tap544805a8-50: entered promiscuous mode
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.591 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.593 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap544805a8-50, col_values=(('external_ids', {'iface-id': 'd02591e1-2a8b-41a2-920e-842684cdeec7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.594 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:31Z|00515|binding|INFO|Releasing lport d02591e1-2a8b-41a2-920e-842684cdeec7 from this chassis (sb_readonly=0)
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.618 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.620 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/544805a8-5e0b-4bae-8cfd-7b8c88fe58a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/544805a8-5e0b-4bae-8cfd-7b8c88fe58a2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.621 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[054804fd-da3b-4968-a6e7-453a1adddd35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.622 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/544805a8-5e0b-4bae-8cfd-7b8c88fe58a2.pid.haproxy
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 544805a8-5e0b-4bae-8cfd-7b8c88fe58a2
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:41:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:31.623 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'env', 'PROCESS_TAG=haproxy-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/544805a8-5e0b-4bae-8cfd-7b8c88fe58a2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.717 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121691.717459, c93b2196-c404-45e0-93af-3dc1e7f48c5f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.718 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] VM Started (Lifecycle Event)#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.747 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.752 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121691.718401, c93b2196-c404-45e0-93af-3dc1e7f48c5f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.752 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.775 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.778 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.801 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.813 182627 DEBUG nova.compute.manager [req-918f61a3-a0f7-4e30-b0c2-f9f66bb259de req-ee141c34-11c6-4b01-93c7-4f9d2fbdf8eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.813 182627 DEBUG oslo_concurrency.lockutils [req-918f61a3-a0f7-4e30-b0c2-f9f66bb259de req-ee141c34-11c6-4b01-93c7-4f9d2fbdf8eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.813 182627 DEBUG oslo_concurrency.lockutils [req-918f61a3-a0f7-4e30-b0c2-f9f66bb259de req-ee141c34-11c6-4b01-93c7-4f9d2fbdf8eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.814 182627 DEBUG oslo_concurrency.lockutils [req-918f61a3-a0f7-4e30-b0c2-f9f66bb259de req-ee141c34-11c6-4b01-93c7-4f9d2fbdf8eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.814 182627 DEBUG nova.compute.manager [req-918f61a3-a0f7-4e30-b0c2-f9f66bb259de req-ee141c34-11c6-4b01-93c7-4f9d2fbdf8eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Processing event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.815 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.819 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121691.81922, c93b2196-c404-45e0-93af-3dc1e7f48c5f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.820 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.822 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.825 182627 INFO nova.virt.libvirt.driver [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Instance spawned successfully.#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.825 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.846 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.854 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.859 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.860 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.860 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.861 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.862 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.862 182627 DEBUG nova.virt.libvirt.driver [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.877 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.930 182627 INFO nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Took 10.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.932 182627 DEBUG nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:31 np0005592767 nova_compute[182623]: 2026-01-22 22:41:31.985 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:32 np0005592767 nova_compute[182623]: 2026-01-22 22:41:32.053 182627 INFO nova.compute.manager [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Took 10.65 seconds to build instance.#033[00m
Jan 22 17:41:32 np0005592767 nova_compute[182623]: 2026-01-22 22:41:32.075 182627 DEBUG oslo_concurrency.lockutils [None req-92935356-7cd3-40af-be21-116eaf9554da fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:32 np0005592767 podman[230788]: 2026-01-22 22:41:32.121589101 +0000 UTC m=+0.073078593 container create cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:41:32 np0005592767 systemd[1]: Started libpod-conmon-cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b.scope.
Jan 22 17:41:32 np0005592767 podman[230788]: 2026-01-22 22:41:32.078279299 +0000 UTC m=+0.029768881 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:41:32 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:41:32 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a2e4536f353c193b69eec0a01b7a783a9371ed162a7f918f7cfd9bb115e9308/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:41:32 np0005592767 podman[230788]: 2026-01-22 22:41:32.209007878 +0000 UTC m=+0.160497400 container init cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:41:32 np0005592767 podman[230788]: 2026-01-22 22:41:32.213955678 +0000 UTC m=+0.165445170 container start cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:41:32 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [NOTICE]   (230807) : New worker (230809) forked
Jan 22 17:41:32 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [NOTICE]   (230807) : Loading success.
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.327 182627 DEBUG nova.network.neutron [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Updated VIF entry in instance network info cache for port d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.328 182627 DEBUG nova.network.neutron [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Updating instance_info_cache with network_info: [{"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.342 182627 DEBUG oslo_concurrency.lockutils [req-71bab513-9004-44c8-a6a9-7126e1a27d41 req-89aeea42-124b-4193-a9d9-77294253c7e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c93b2196-c404-45e0-93af-3dc1e7f48c5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.926 182627 DEBUG nova.compute.manager [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.927 182627 DEBUG oslo_concurrency.lockutils [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.927 182627 DEBUG oslo_concurrency.lockutils [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.928 182627 DEBUG oslo_concurrency.lockutils [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.929 182627 DEBUG nova.compute.manager [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] No waiting events found dispatching network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:33 np0005592767 nova_compute[182623]: 2026-01-22 22:41:33.930 182627 WARNING nova.compute.manager [req-fc144d8a-7f7a-4c9c-8543-b39042ec1bb8 req-fe6321aa-bcb7-4f8a-aa45-8ec533c32c02 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received unexpected event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe for instance with vm_state active and task_state None.#033[00m
Jan 22 17:41:34 np0005592767 podman[230819]: 2026-01-22 22:41:34.194403307 +0000 UTC m=+0.092003537 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Jan 22 17:41:34 np0005592767 podman[230818]: 2026-01-22 22:41:34.242611698 +0000 UTC m=+0.136046690 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:41:34 np0005592767 nova_compute[182623]: 2026-01-22 22:41:34.604 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121679.603284, 3bede382-d3d0-4053-a98e-0add602d4f2f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:34 np0005592767 nova_compute[182623]: 2026-01-22 22:41:34.605 182627 INFO nova.compute.manager [-] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:41:34 np0005592767 nova_compute[182623]: 2026-01-22 22:41:34.640 182627 DEBUG nova.compute.manager [None req-8919d6b9-49ba-446e-bb37-5120456041e7 - - - - - -] [instance: 3bede382-d3d0-4053-a98e-0add602d4f2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:34 np0005592767 nova_compute[182623]: 2026-01-22 22:41:34.826 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:34 np0005592767 nova_compute[182623]: 2026-01-22 22:41:34.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.420 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.420 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.420 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.421 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.421 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.434 182627 INFO nova.compute.manager [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Terminating instance#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.445 182627 DEBUG nova.compute.manager [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:41:35 np0005592767 kernel: tapd7e4ecf2-4b (unregistering): left promiscuous mode
Jan 22 17:41:35 np0005592767 NetworkManager[54973]: <info>  [1769121695.4696] device (tapd7e4ecf2-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.480 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:35Z|00516|binding|INFO|Releasing lport d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe from this chassis (sb_readonly=0)
Jan 22 17:41:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:35Z|00517|binding|INFO|Setting lport d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe down in Southbound
Jan 22 17:41:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:35Z|00518|binding|INFO|Removing iface tapd7e4ecf2-4b ovn-installed in OVS
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.485 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.489 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:c5:e0 10.100.0.10'], port_security=['fa:16:3e:fc:c5:e0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c93b2196-c404-45e0-93af-3dc1e7f48c5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0007cc6ba2c4c2abd654f4226e30456', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24362a1d-6f65-40a7-9f9f-3ea85e430d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=16d15a91-8dbb-4714-a558-6d332aa817b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.491 104135 INFO neutron.agent.ovn.metadata.agent [-] Port d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe in datapath 544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 unbound from our chassis#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.493 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.494 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4999c3-bbe1-4b2c-b843-b46a8da4befd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.497 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 namespace which is not needed anymore#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.508 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000088.scope: Deactivated successfully.
Jan 22 17:41:35 np0005592767 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000088.scope: Consumed 4.190s CPU time.
Jan 22 17:41:35 np0005592767 systemd-machined[153912]: Machine qemu-67-instance-00000088 terminated.
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.670 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.678 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [NOTICE]   (230807) : haproxy version is 2.8.14-c23fe91
Jan 22 17:41:35 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [NOTICE]   (230807) : path to executable is /usr/sbin/haproxy
Jan 22 17:41:35 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [WARNING]  (230807) : Exiting Master process...
Jan 22 17:41:35 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [WARNING]  (230807) : Exiting Master process...
Jan 22 17:41:35 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [ALERT]    (230807) : Current worker (230809) exited with code 143 (Terminated)
Jan 22 17:41:35 np0005592767 neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2[230803]: [WARNING]  (230807) : All workers exited. Exiting... (0)
Jan 22 17:41:35 np0005592767 systemd[1]: libpod-cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b.scope: Deactivated successfully.
Jan 22 17:41:35 np0005592767 podman[230887]: 2026-01-22 22:41:35.694144354 +0000 UTC m=+0.064857331 container died cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.731 182627 INFO nova.virt.libvirt.driver [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Instance destroyed successfully.#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.732 182627 DEBUG nova.objects.instance [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lazy-loading 'resources' on Instance uuid c93b2196-c404-45e0-93af-3dc1e7f48c5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b-userdata-shm.mount: Deactivated successfully.
Jan 22 17:41:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay-6a2e4536f353c193b69eec0a01b7a783a9371ed162a7f918f7cfd9bb115e9308-merged.mount: Deactivated successfully.
Jan 22 17:41:35 np0005592767 podman[230887]: 2026-01-22 22:41:35.749552718 +0000 UTC m=+0.120265695 container cleanup cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.753 182627 DEBUG nova.virt.libvirt.vif [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-699455060',display_name='tempest-ServerAddressesNegativeTestJSON-server-699455060',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-699455060',id=136,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:41:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e0007cc6ba2c4c2abd654f4226e30456',ramdisk_id='',reservation_id='r-ul5caoao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1506497326',owner_user_name='tempest-ServerAddressesNegativeTestJSON-1506497326-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:41:31Z,user_data=None,user_id='fc0ad46b71ee49809728db8b218a1bbd',uuid=c93b2196-c404-45e0-93af-3dc1e7f48c5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.754 182627 DEBUG nova.network.os_vif_util [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Converting VIF {"id": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "address": "fa:16:3e:fc:c5:e0", "network": {"id": "544805a8-5e0b-4bae-8cfd-7b8c88fe58a2", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-499921560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e0007cc6ba2c4c2abd654f4226e30456", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7e4ecf2-4b", "ovs_interfaceid": "d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.755 182627 DEBUG nova.network.os_vif_util [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.755 182627 DEBUG os_vif [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.758 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7e4ecf2-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.759 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 systemd[1]: libpod-conmon-cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b.scope: Deactivated successfully.
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.763 182627 INFO os_vif [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:c5:e0,bridge_name='br-int',has_traffic_filtering=True,id=d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe,network=Network(544805a8-5e0b-4bae-8cfd-7b8c88fe58a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7e4ecf2-4b')#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.763 182627 INFO nova.virt.libvirt.driver [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Deleting instance files /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f_del#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.764 182627 INFO nova.virt.libvirt.driver [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Deletion of /var/lib/nova/instances/c93b2196-c404-45e0-93af-3dc1e7f48c5f_del complete#033[00m
Jan 22 17:41:35 np0005592767 podman[230932]: 2026-01-22 22:41:35.834480724 +0000 UTC m=+0.046764770 container remove cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.840 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cba98002-9fa5-4f05-93b6-2e5fc2aa38cf]: (4, ('Thu Jan 22 10:41:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 (cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b)\ncbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b\nThu Jan 22 10:41:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 (cbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b)\ncbd193f3108df0ea36948fcff5341c7ce00550d0247a4683fcef3d9212a5a86b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.842 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75c5a141-b4eb-4ef8-96ce-f41a9ea948b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.845 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap544805a8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.892 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 kernel: tap544805a8-50: left promiscuous mode
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.901 182627 INFO nova.compute.manager [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.902 182627 DEBUG oslo.service.loopingcall [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.903 182627 DEBUG nova.compute.manager [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.903 182627 DEBUG nova.network.neutron [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:41:35 np0005592767 nova_compute[182623]: 2026-01-22 22:41:35.905 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.905 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7549a7-d070-41f8-83ae-cc94e24221b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.926 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9aab0593-8131-4a6f-836a-5db4931352f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.928 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9634c368-4480-4cae-9613-e09e0afdd6d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.952 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[37c06a67-c076-4f98-b42a-7e609c575802]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528785, 'reachable_time': 34571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230946, 'error': None, 'target': 'ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:35 np0005592767 systemd[1]: run-netns-ovnmeta\x2d544805a8\x2d5e0b\x2d4bae\x2d8cfd\x2d7b8c88fe58a2.mount: Deactivated successfully.
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.957 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-544805a8-5e0b-4bae-8cfd-7b8c88fe58a2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:41:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:35.957 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[4362ce3c-16a7-435b-b50f-585bcfefae7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.125 182627 DEBUG nova.compute.manager [req-e99d93f9-a788-40c9-901a-76a4e0e44789 req-ce92acac-efc4-4aee-a3d9-385e863623c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-vif-unplugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.125 182627 DEBUG oslo_concurrency.lockutils [req-e99d93f9-a788-40c9-901a-76a4e0e44789 req-ce92acac-efc4-4aee-a3d9-385e863623c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.125 182627 DEBUG oslo_concurrency.lockutils [req-e99d93f9-a788-40c9-901a-76a4e0e44789 req-ce92acac-efc4-4aee-a3d9-385e863623c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.126 182627 DEBUG oslo_concurrency.lockutils [req-e99d93f9-a788-40c9-901a-76a4e0e44789 req-ce92acac-efc4-4aee-a3d9-385e863623c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.126 182627 DEBUG nova.compute.manager [req-e99d93f9-a788-40c9-901a-76a4e0e44789 req-ce92acac-efc4-4aee-a3d9-385e863623c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] No waiting events found dispatching network-vif-unplugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.126 182627 DEBUG nova.compute.manager [req-e99d93f9-a788-40c9-901a-76a4e0e44789 req-ce92acac-efc4-4aee-a3d9-385e863623c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-vif-unplugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:41:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:36.485 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:41:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:36.485 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.488 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.713 182627 DEBUG nova.network.neutron [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.746 182627 INFO nova.compute.manager [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Took 0.84 seconds to deallocate network for instance.#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.849 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.850 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.906 182627 DEBUG nova.compute.provider_tree [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.926 182627 DEBUG nova.scheduler.client.report [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.948 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:36 np0005592767 nova_compute[182623]: 2026-01-22 22:41:36.976 182627 INFO nova.scheduler.client.report [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Deleted allocations for instance c93b2196-c404-45e0-93af-3dc1e7f48c5f#033[00m
Jan 22 17:41:37 np0005592767 nova_compute[182623]: 2026-01-22 22:41:37.020 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:37 np0005592767 nova_compute[182623]: 2026-01-22 22:41:37.087 182627 DEBUG oslo_concurrency.lockutils [None req-acc174e2-6709-4f9f-ac4a-4982e4c9061f fc0ad46b71ee49809728db8b218a1bbd e0007cc6ba2c4c2abd654f4226e30456 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.321 182627 DEBUG nova.compute.manager [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.323 182627 DEBUG oslo_concurrency.lockutils [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.324 182627 DEBUG oslo_concurrency.lockutils [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.325 182627 DEBUG oslo_concurrency.lockutils [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c93b2196-c404-45e0-93af-3dc1e7f48c5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.325 182627 DEBUG nova.compute.manager [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] No waiting events found dispatching network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.325 182627 WARNING nova.compute.manager [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received unexpected event network-vif-plugged-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:41:38 np0005592767 nova_compute[182623]: 2026-01-22 22:41:38.325 182627 DEBUG nova.compute.manager [req-91efad27-6eac-4b39-a3b6-c22c46d56c83 req-9bf0f0ad-42ad-48dc-9ec0-d4724ac304e4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Received event network-vif-deleted-d7e4ecf2-4b2c-4874-824e-e9ead6ecd2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.420 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.420 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.421 182627 INFO nova.compute.manager [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Unshelving#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.520 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.520 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.527 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'pci_requests' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.539 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'numa_topology' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.558 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.559 182627 INFO nova.compute.claims [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.678 182627 DEBUG nova.compute.provider_tree [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.702 182627 DEBUG nova.scheduler.client.report [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:41:39 np0005592767 nova_compute[182623]: 2026-01-22 22:41:39.751 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:40 np0005592767 nova_compute[182623]: 2026-01-22 22:41:40.086 182627 INFO nova.network.neutron [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Updating port ffdd67b2-a5fd-4655-b692-8b8d34b65828 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 22 17:41:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:40.488 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:40 np0005592767 nova_compute[182623]: 2026-01-22 22:41:40.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:40 np0005592767 nova_compute[182623]: 2026-01-22 22:41:40.935 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:41 np0005592767 nova_compute[182623]: 2026-01-22 22:41:41.052 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:41:41 np0005592767 nova_compute[182623]: 2026-01-22 22:41:41.053 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquired lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:41:41 np0005592767 nova_compute[182623]: 2026-01-22 22:41:41.053 182627 DEBUG nova.network.neutron [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:41:41 np0005592767 nova_compute[182623]: 2026-01-22 22:41:41.142 182627 DEBUG nova.compute.manager [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-changed-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:41 np0005592767 nova_compute[182623]: 2026-01-22 22:41:41.143 182627 DEBUG nova.compute.manager [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Refreshing instance network info cache due to event network-changed-ffdd67b2-a5fd-4655-b692-8b8d34b65828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:41:41 np0005592767 nova_compute[182623]: 2026-01-22 22:41:41.143 182627 DEBUG oslo_concurrency.lockutils [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:41:41 np0005592767 podman[230947]: 2026-01-22 22:41:41.173421187 +0000 UTC m=+0.084071933 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:41:41 np0005592767 podman[230948]: 2026-01-22 22:41:41.187207666 +0000 UTC m=+0.094056785 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.022 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.928 182627 DEBUG nova.network.neutron [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Updating instance_info_cache with network_info: [{"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.955 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Releasing lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.956 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.956 182627 INFO nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Creating image(s)#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.957 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "/var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.957 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "/var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.958 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "/var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.958 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.959 182627 DEBUG oslo_concurrency.lockutils [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.959 182627 DEBUG nova.network.neutron [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Refreshing network info cache for port ffdd67b2-a5fd-4655-b692-8b8d34b65828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.976 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "bfc2a2b0a125b37685b6254f144e511a6d354259" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:42 np0005592767 nova_compute[182623]: 2026-01-22 22:41:42.976 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "bfc2a2b0a125b37685b6254f144e511a6d354259" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.206 182627 DEBUG nova.network.neutron [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Updated VIF entry in instance network info cache for port ffdd67b2-a5fd-4655-b692-8b8d34b65828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.207 182627 DEBUG nova.network.neutron [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Updating instance_info_cache with network_info: [{"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.229 182627 DEBUG oslo_concurrency.lockutils [req-bffecbb0-f7fa-418c-9cc9-b01fb5542d9e req-c3680bc7-9ba0-4068-aaa9-4aa04f43b9ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.404 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.475 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.part --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.477 182627 DEBUG nova.virt.images [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] 5f4335bf-4b3b-43ea-ad0b-f20b635f2f10 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.479 182627 DEBUG nova.privsep.utils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.480 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.part /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.745 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.part /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.converted" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.764 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.798 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.866 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259.converted --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.867 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "bfc2a2b0a125b37685b6254f144e511a6d354259" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.879 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.946 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.947 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "bfc2a2b0a125b37685b6254f144e511a6d354259" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.948 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "bfc2a2b0a125b37685b6254f144e511a6d354259" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:45 np0005592767 nova_compute[182623]: 2026-01-22 22:41:45.960 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.047 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.048 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259,backing_fmt=raw /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.093 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259,backing_fmt=raw /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.095 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "bfc2a2b0a125b37685b6254f144e511a6d354259" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.095 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.178 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.180 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'migration_context' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.199 182627 INFO nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Rebasing disk image.#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.200 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.267 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:46 np0005592767 nova_compute[182623]: 2026-01-22 22:41:46.268 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e -F raw /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.841 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e -F raw /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk" returned: 0 in 1.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.842 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.842 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Ensure instance console log exists: /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.843 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.843 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.843 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.845 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Start _get_guest_xml network_info=[{"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='a913e0445a8f72b0b207359179797e62',container_format='bare',created_at=2026-01-22T22:41:20Z,direct_url=<?>,disk_format='qcow2',id=5f4335bf-4b3b-43ea-ad0b-f20b635f2f10,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1667734094-shelved',owner='5906f64d8ee84f068ff9caa68ae3652b',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2026-01-22T22:41:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.850 182627 WARNING nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.857 182627 DEBUG nova.virt.libvirt.host [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.858 182627 DEBUG nova.virt.libvirt.host [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.868 182627 DEBUG nova.virt.libvirt.host [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.869 182627 DEBUG nova.virt.libvirt.host [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.871 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.871 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='a913e0445a8f72b0b207359179797e62',container_format='bare',created_at=2026-01-22T22:41:20Z,direct_url=<?>,disk_format='qcow2',id=5f4335bf-4b3b-43ea-ad0b-f20b635f2f10,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1667734094-shelved',owner='5906f64d8ee84f068ff9caa68ae3652b',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2026-01-22T22:41:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.871 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.872 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.872 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.872 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.872 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.872 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.873 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.873 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.873 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.873 182627 DEBUG nova.virt.hardware [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.873 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.893 182627 DEBUG nova.virt.libvirt.vif [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:39:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1667734094',display_name='tempest-ServersNegativeTestJSON-server-1667734094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1667734094',id=130,image_ref='5f4335bf-4b3b-43ea-ad0b-f20b635f2f10',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-xnvred5b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeTestJSON-2095273166-project-member',shelved_at='2026-01-22T22:41:27.461724',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:39Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=1fc01b1b-88f4-4078-a423-704c20c2ba9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.893 182627 DEBUG nova.network.os_vif_util [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.894 182627 DEBUG nova.network.os_vif_util [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.895 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.906 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <uuid>1fc01b1b-88f4-4078-a423-704c20c2ba9d</uuid>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <name>instance-00000082</name>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersNegativeTestJSON-server-1667734094</nova:name>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:41:47</nova:creationTime>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:user uuid="45cd11974e6648e1872fb5ebf9dee0b1">tempest-ServersNegativeTestJSON-2095273166-project-member</nova:user>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:project uuid="5906f64d8ee84f068ff9caa68ae3652b">tempest-ServersNegativeTestJSON-2095273166</nova:project>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="5f4335bf-4b3b-43ea-ad0b-f20b635f2f10"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        <nova:port uuid="ffdd67b2-a5fd-4655-b692-8b8d34b65828">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <entry name="serial">1fc01b1b-88f4-4078-a423-704c20c2ba9d</entry>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <entry name="uuid">1fc01b1b-88f4-4078-a423-704c20c2ba9d</entry>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.config"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:2b:ce:32"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <target dev="tapffdd67b2-a5"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/console.log" append="off"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <input type="keyboard" bus="usb"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:41:47 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:41:47 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:41:47 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:41:47 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.907 182627 DEBUG nova.compute.manager [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Preparing to wait for external event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.908 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.909 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.909 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.909 182627 DEBUG nova.virt.libvirt.vif [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:39:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1667734094',display_name='tempest-ServersNegativeTestJSON-server-1667734094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1667734094',id=130,image_ref='5f4335bf-4b3b-43ea-ad0b-f20b635f2f10',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:39:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-xnvred5b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeTestJSON-2095273166-project-member',shelved_at='2026-01-22T22:41:27.461724',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:39Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=1fc01b1b-88f4-4078-a423-704c20c2ba9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.910 182627 DEBUG nova.network.os_vif_util [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.910 182627 DEBUG nova.network.os_vif_util [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.911 182627 DEBUG os_vif [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.911 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.911 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.912 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.914 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.914 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffdd67b2-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.915 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapffdd67b2-a5, col_values=(('external_ids', {'iface-id': 'ffdd67b2-a5fd-4655-b692-8b8d34b65828', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:ce:32', 'vm-uuid': '1fc01b1b-88f4-4078-a423-704c20c2ba9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:47 np0005592767 NetworkManager[54973]: <info>  [1769121707.9178] manager: (tapffdd67b2-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.921 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:47 np0005592767 nova_compute[182623]: 2026-01-22 22:41:47.925 182627 INFO os_vif [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5')#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.003 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.004 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.004 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] No VIF found with MAC fa:16:3e:2b:ce:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.004 182627 INFO nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Using config drive#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.019 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.065 182627 DEBUG nova.objects.instance [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'keypairs' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.819 182627 INFO nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Creating config drive at /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.config#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.830 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfr4u29mm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:41:48 np0005592767 nova_compute[182623]: 2026-01-22 22:41:48.960 182627 DEBUG oslo_concurrency.processutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfr4u29mm" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:41:49 np0005592767 kernel: tapffdd67b2-a5: entered promiscuous mode
Jan 22 17:41:49 np0005592767 NetworkManager[54973]: <info>  [1769121709.0734] manager: (tapffdd67b2-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.079 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:49Z|00519|binding|INFO|Claiming lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 for this chassis.
Jan 22 17:41:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:49Z|00520|binding|INFO|ffdd67b2-a5fd-4655-b692-8b8d34b65828: Claiming fa:16:3e:2b:ce:32 10.100.0.13
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.083 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.085 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.091 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.106 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:ce:32 10.100.0.13'], port_security=['fa:16:3e:2b:ce:32 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'neutron:revision_number': '7', 'neutron:security_group_ids': '06798119-3cf9-4579-b6fe-7ef0a3f57792', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af00d925-c6e8-4c1e-8ae7-75c6556913d1, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ffdd67b2-a5fd-4655-b692-8b8d34b65828) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.108 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ffdd67b2-a5fd-4655-b692-8b8d34b65828 in datapath 17ab2e5b-049b-4984-a18a-6b3e44614ef5 bound to our chassis#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.110 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ab2e5b-049b-4984-a18a-6b3e44614ef5#033[00m
Jan 22 17:41:49 np0005592767 systemd-machined[153912]: New machine qemu-68-instance-00000082.
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.131 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4f375750-7fcb-4691-b00a-8e5ae35d94df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.132 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ab2e5b-01 in ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.135 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ab2e5b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.135 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbc5300-50ff-4231-abc7-0f986dc65ecb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.136 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1f403c16-e2f5-4935-8d46-d65fa08280fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 systemd[1]: Started Virtual Machine qemu-68-instance-00000082.
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.152 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b4f037-f5aa-4207-8562-346138ec0154]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:49Z|00521|binding|INFO|Setting lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 ovn-installed in OVS
Jan 22 17:41:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:49Z|00522|binding|INFO|Setting lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 up in Southbound
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 systemd-udevd[231060]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.172 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1c0693-cce2-4c4e-870b-b0a62c6dbd33]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 NetworkManager[54973]: <info>  [1769121709.1975] device (tapffdd67b2-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:41:49 np0005592767 NetworkManager[54973]: <info>  [1769121709.1981] device (tapffdd67b2-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.214 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdc95f9-fa6f-43ed-970c-be27d8f9033e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 NetworkManager[54973]: <info>  [1769121709.2223] manager: (tap17ab2e5b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.223 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b3dcd192-9e9a-4195-84c9-8415f99fc058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 podman[231033]: 2026-01-22 22:41:49.223996642 +0000 UTC m=+0.142667306 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.266 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b55960b4-b6cf-4093-a765-c4b5fd98553c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.272 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[54b3ba54-18f5-4d3b-928d-cf5834a28f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 NetworkManager[54973]: <info>  [1769121709.3076] device (tap17ab2e5b-00): carrier: link connected
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.320 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7e7361fb-9232-4dbf-b5c2-a3499020deb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.342 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5094928a-5892-4ef3-8f69-1031e8eb40d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ab2e5b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:d4:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530588, 'reachable_time': 18315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231098, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.364 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8d390f-3adf-417e-8f0a-baf8c030e237]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:d4d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530588, 'tstamp': 530588}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231099, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.387 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6a415e10-4235-47c8-8877-3721eadc35a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ab2e5b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:d4:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530588, 'reachable_time': 18315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231100, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.433 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[96ae73a2-0a9b-4869-a3d8-aa80f11c5c85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.515 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[80c7b1df-2c98-48d5-9156-2a62c487b62c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.518 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ab2e5b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.518 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.519 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ab2e5b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:49 np0005592767 kernel: tap17ab2e5b-00: entered promiscuous mode
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.523 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 NetworkManager[54973]: <info>  [1769121709.5277] manager: (tap17ab2e5b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.527 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ab2e5b-00, col_values=(('external_ids', {'iface-id': 'e1725d3a-3bc9-46b5-a1d1-153d0147aff7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.530 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:49 np0005592767 ovn_controller[94769]: 2026-01-22T22:41:49Z|00523|binding|INFO|Releasing lport e1725d3a-3bc9-46b5-a1d1-153d0147aff7 from this chassis (sb_readonly=0)
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.532 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.534 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[81c82048-274c-407f-aad3-d31da38fc32d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.535 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:41:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:41:49.536 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'env', 'PROCESS_TAG=haproxy-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ab2e5b-049b-4984-a18a-6b3e44614ef5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:41:49 np0005592767 nova_compute[182623]: 2026-01-22 22:41:49.544 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:50 np0005592767 podman[231134]: 2026-01-22 22:41:50.028351528 +0000 UTC m=+0.051549205 container create 9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:41:50 np0005592767 systemd[1]: Started libpod-conmon-9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc.scope.
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.088 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121710.087434, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.089 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Started (Lifecycle Event)#033[00m
Jan 22 17:41:50 np0005592767 podman[231134]: 2026-01-22 22:41:50.004608048 +0000 UTC m=+0.027805745 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.112 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.117 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121710.0879266, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.117 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:41:50 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:41:50 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb117da87b29be144f55891a42e1b6d1eb99efe708c152a12bdd6728ca433cb6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.140 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.144 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:41:50 np0005592767 podman[231134]: 2026-01-22 22:41:50.151015059 +0000 UTC m=+0.174212756 container init 9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:41:50 np0005592767 podman[231134]: 2026-01-22 22:41:50.162007729 +0000 UTC m=+0.185205446 container start 9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.173 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:41:50 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [NOTICE]   (231156) : New worker (231158) forked
Jan 22 17:41:50 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [NOTICE]   (231156) : Loading success.
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.730 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121695.7282507, c93b2196-c404-45e0-93af-3dc1e7f48c5f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.731 182627 INFO nova.compute.manager [-] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:41:50 np0005592767 nova_compute[182623]: 2026-01-22 22:41:50.762 182627 DEBUG nova.compute.manager [None req-3ca1a4b9-b613-483e-82d4-f86d27d56056 - - - - - -] [instance: c93b2196-c404-45e0-93af-3dc1e7f48c5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:52 np0005592767 nova_compute[182623]: 2026-01-22 22:41:52.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:52 np0005592767 nova_compute[182623]: 2026-01-22 22:41:52.918 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.106 182627 DEBUG nova.compute.manager [req-05e16936-6c96-45bb-8174-5681bd6e84e7 req-9bb849bc-c2ae-4037-83cd-720c93be9c9c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.108 182627 DEBUG oslo_concurrency.lockutils [req-05e16936-6c96-45bb-8174-5681bd6e84e7 req-9bb849bc-c2ae-4037-83cd-720c93be9c9c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.108 182627 DEBUG oslo_concurrency.lockutils [req-05e16936-6c96-45bb-8174-5681bd6e84e7 req-9bb849bc-c2ae-4037-83cd-720c93be9c9c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.109 182627 DEBUG oslo_concurrency.lockutils [req-05e16936-6c96-45bb-8174-5681bd6e84e7 req-9bb849bc-c2ae-4037-83cd-720c93be9c9c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.109 182627 DEBUG nova.compute.manager [req-05e16936-6c96-45bb-8174-5681bd6e84e7 req-9bb849bc-c2ae-4037-83cd-720c93be9c9c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Processing event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.111 182627 DEBUG nova.compute.manager [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.116 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121713.116583, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.117 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.121 182627 DEBUG nova.virt.libvirt.driver [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.126 182627 INFO nova.virt.libvirt.driver [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Instance spawned successfully.#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.144 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.152 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.191 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:41:53 np0005592767 nova_compute[182623]: 2026-01-22 22:41:53.954 182627 DEBUG nova.compute.manager [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:41:54 np0005592767 nova_compute[182623]: 2026-01-22 22:41:54.074 182627 DEBUG oslo_concurrency.lockutils [None req-a165fed9-dc8d-4b23-be2b-61b84d5ff6ca 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 14.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:55 np0005592767 nova_compute[182623]: 2026-01-22 22:41:55.216 182627 DEBUG nova.compute.manager [req-3756767f-5542-495f-bf98-5b148c6a8592 req-559b05d4-70e4-4ae1-9e61-cfdeb24eeffa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:41:55 np0005592767 nova_compute[182623]: 2026-01-22 22:41:55.216 182627 DEBUG oslo_concurrency.lockutils [req-3756767f-5542-495f-bf98-5b148c6a8592 req-559b05d4-70e4-4ae1-9e61-cfdeb24eeffa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:55 np0005592767 nova_compute[182623]: 2026-01-22 22:41:55.217 182627 DEBUG oslo_concurrency.lockutils [req-3756767f-5542-495f-bf98-5b148c6a8592 req-559b05d4-70e4-4ae1-9e61-cfdeb24eeffa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:55 np0005592767 nova_compute[182623]: 2026-01-22 22:41:55.217 182627 DEBUG oslo_concurrency.lockutils [req-3756767f-5542-495f-bf98-5b148c6a8592 req-559b05d4-70e4-4ae1-9e61-cfdeb24eeffa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:55 np0005592767 nova_compute[182623]: 2026-01-22 22:41:55.217 182627 DEBUG nova.compute.manager [req-3756767f-5542-495f-bf98-5b148c6a8592 req-559b05d4-70e4-4ae1-9e61-cfdeb24eeffa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:41:55 np0005592767 nova_compute[182623]: 2026-01-22 22:41:55.217 182627 WARNING nova.compute.manager [req-3756767f-5542-495f-bf98-5b148c6a8592 req-559b05d4-70e4-4ae1-9e61-cfdeb24eeffa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received unexpected event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:41:57 np0005592767 nova_compute[182623]: 2026-01-22 22:41:57.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:57 np0005592767 nova_compute[182623]: 2026-01-22 22:41:57.921 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.032 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.033 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.050 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.150 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.151 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.159 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.160 182627 INFO nova.compute.claims [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.373 182627 DEBUG nova.compute.provider_tree [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.396 182627 DEBUG nova.scheduler.client.report [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.433 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.461 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "0e2e41ac-2dcc-48c4-9ef9-979462b6661e" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.461 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "0e2e41ac-2dcc-48c4-9ef9-979462b6661e" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.472 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "0e2e41ac-2dcc-48c4-9ef9-979462b6661e" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.474 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.526 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.527 182627 DEBUG nova.network.neutron [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.547 182627 INFO nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.569 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.692 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.699 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.700 182627 INFO nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Creating image(s)
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.702 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "/var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.702 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "/var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.704 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "/var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.732 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.775 182627 DEBUG nova.policy [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5cbb91df5f144eadae17eed4a2d57d77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '06011c8d6bc84dc89089b46ecd599b94', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.808 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.809 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.810 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.826 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.894 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.895 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.929 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.930 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.931 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.985 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.989 182627 DEBUG nova.virt.disk.api [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Checking if we can resize image /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:41:58 np0005592767 nova_compute[182623]: 2026-01-22 22:41:58.989 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.046 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.048 182627 DEBUG nova.virt.disk.api [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Cannot resize image /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.049 182627 DEBUG nova.objects.instance [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f785f08-848b-4f0c-8abd-1c873b56739b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.067 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.068 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Ensure instance console log exists: /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.069 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.069 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:41:59 np0005592767 nova_compute[182623]: 2026-01-22 22:41:59.070 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:00 np0005592767 podman[231182]: 2026-01-22 22:42:00.239773224 +0000 UTC m=+0.135876505 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.497 182627 DEBUG nova.network.neutron [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Successfully created port: 723c776a-4b17-4616-9452-6583a71b0739 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.836 182627 DEBUG nova.objects.instance [None req-dedc5c2a-7857-49db-b9d0-90cc34072936 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.864 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121720.8640587, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.864 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Paused (Lifecycle Event)
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.882 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.905 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:42:00 np0005592767 nova_compute[182623]: 2026-01-22 22:42:00.926 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.326 182627 DEBUG nova.network.neutron [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Successfully updated port: 723c776a-4b17-4616-9452-6583a71b0739 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.345 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "refresh_cache-5f785f08-848b-4f0c-8abd-1c873b56739b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.346 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquired lock "refresh_cache-5f785f08-848b-4f0c-8abd-1c873b56739b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.346 182627 DEBUG nova.network.neutron [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.407 182627 DEBUG nova.compute.manager [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-changed-723c776a-4b17-4616-9452-6583a71b0739 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.408 182627 DEBUG nova.compute.manager [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Refreshing instance network info cache due to event network-changed-723c776a-4b17-4616-9452-6583a71b0739. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.408 182627 DEBUG oslo_concurrency.lockutils [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-5f785f08-848b-4f0c-8abd-1c873b56739b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.484 182627 DEBUG nova.network.neutron [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:42:01 np0005592767 kernel: tapffdd67b2-a5 (unregistering): left promiscuous mode
Jan 22 17:42:01 np0005592767 NetworkManager[54973]: <info>  [1769121721.5914] device (tapffdd67b2-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:01Z|00524|binding|INFO|Releasing lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 from this chassis (sb_readonly=0)
Jan 22 17:42:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:01Z|00525|binding|INFO|Setting lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 down in Southbound
Jan 22 17:42:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:01Z|00526|binding|INFO|Removing iface tapffdd67b2-a5 ovn-installed in OVS
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.606 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:ce:32 10.100.0.13'], port_security=['fa:16:3e:2b:ce:32 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '06798119-3cf9-4579-b6fe-7ef0a3f57792', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af00d925-c6e8-4c1e-8ae7-75c6556913d1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ffdd67b2-a5fd-4655-b692-8b8d34b65828) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.609 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ffdd67b2-a5fd-4655-b692-8b8d34b65828 in datapath 17ab2e5b-049b-4984-a18a-6b3e44614ef5 unbound from our chassis
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.605 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.613 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ab2e5b-049b-4984-a18a-6b3e44614ef5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.616 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef4cb5b-65be-4f0b-88bd-421d7cb09473]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.617 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 namespace which is not needed anymore
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.628 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:01 np0005592767 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 22 17:42:01 np0005592767 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000082.scope: Consumed 9.313s CPU time.
Jan 22 17:42:01 np0005592767 systemd-machined[153912]: Machine qemu-68-instance-00000082 terminated.
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.794 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.801 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:01 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [NOTICE]   (231156) : haproxy version is 2.8.14-c23fe91
Jan 22 17:42:01 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [NOTICE]   (231156) : path to executable is /usr/sbin/haproxy
Jan 22 17:42:01 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [WARNING]  (231156) : Exiting Master process...
Jan 22 17:42:01 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [WARNING]  (231156) : Exiting Master process...
Jan 22 17:42:01 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [ALERT]    (231156) : Current worker (231158) exited with code 143 (Terminated)
Jan 22 17:42:01 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231152]: [WARNING]  (231156) : All workers exited. Exiting... (0)
Jan 22 17:42:01 np0005592767 systemd[1]: libpod-9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc.scope: Deactivated successfully.
Jan 22 17:42:01 np0005592767 podman[231231]: 2026-01-22 22:42:01.830484206 +0000 UTC m=+0.069606925 container died 9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.857 182627 DEBUG nova.compute.manager [None req-dedc5c2a-7857-49db-b9d0-90cc34072936 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:42:01 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc-userdata-shm.mount: Deactivated successfully.
Jan 22 17:42:01 np0005592767 systemd[1]: var-lib-containers-storage-overlay-eb117da87b29be144f55891a42e1b6d1eb99efe708c152a12bdd6728ca433cb6-merged.mount: Deactivated successfully.
Jan 22 17:42:01 np0005592767 podman[231231]: 2026-01-22 22:42:01.890115919 +0000 UTC m=+0.129238668 container cleanup 9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:42:01 np0005592767 systemd[1]: libpod-conmon-9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc.scope: Deactivated successfully.
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.942 182627 DEBUG nova.compute.manager [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-unplugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.943 182627 DEBUG oslo_concurrency.lockutils [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.944 182627 DEBUG oslo_concurrency.lockutils [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.945 182627 DEBUG oslo_concurrency.lockutils [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.945 182627 DEBUG nova.compute.manager [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-unplugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.946 182627 WARNING nova.compute.manager [req-a9b38a45-00c2-4586-96e8-36324720e20c req-71c7c2ac-2528-42f6-ae4a-aceebe48c968 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received unexpected event network-vif-unplugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with vm_state suspended and task_state None.
Jan 22 17:42:01 np0005592767 podman[231277]: 2026-01-22 22:42:01.967858482 +0000 UTC m=+0.045078153 container remove 9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.977 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dd6213-8cff-406e-9b4e-d50b3ad2ffba]: (4, ('Thu Jan 22 10:42:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 (9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc)\n9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc\nThu Jan 22 10:42:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 (9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc)\n9f242450d7f3e7815c6c5c02ca49fa7938c504a0f8d459c519d6b6981ac97fcc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.979 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[090d31e0-1260-425c-b4d1-625c00dd0980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:01.981 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ab2e5b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:01 np0005592767 nova_compute[182623]: 2026-01-22 22:42:01.984 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:01 np0005592767 kernel: tap17ab2e5b-00: left promiscuous mode
Jan 22 17:42:02 np0005592767 nova_compute[182623]: 2026-01-22 22:42:02.004 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:02.010 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a7d284-c385-49a1-84fe-f490b2b78f76]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:02 np0005592767 nova_compute[182623]: 2026-01-22 22:42:02.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:02.034 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e57aa56a-3174-461e-beb5-5346c968336c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:02.036 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbc398f-ef80-494b-ab28-9567328c415a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:02.050 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5b4a8475-3cb7-4cd9-a2a7-195de10625a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530578, 'reachable_time': 28327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231297, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:02.055 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:42:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:02.055 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[00e9061f-7f77-4646-a33f-cd984c5e053d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:02 np0005592767 systemd[1]: run-netns-ovnmeta\x2d17ab2e5b\x2d049b\x2d4984\x2da18a\x2d6b3e44614ef5.mount: Deactivated successfully.
Jan 22 17:42:02 np0005592767 nova_compute[182623]: 2026-01-22 22:42:02.924 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.644 182627 DEBUG nova.network.neutron [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Updating instance_info_cache with network_info: [{"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.666 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Releasing lock "refresh_cache-5f785f08-848b-4f0c-8abd-1c873b56739b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.667 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Instance network_info: |[{"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.668 182627 DEBUG oslo_concurrency.lockutils [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-5f785f08-848b-4f0c-8abd-1c873b56739b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.669 182627 DEBUG nova.network.neutron [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Refreshing network info cache for port 723c776a-4b17-4616-9452-6583a71b0739 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.674 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Start _get_guest_xml network_info=[{"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.681 182627 WARNING nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.695 182627 DEBUG nova.virt.libvirt.host [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.696 182627 DEBUG nova.virt.libvirt.host [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.700 182627 DEBUG nova.virt.libvirt.host [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.702 182627 DEBUG nova.virt.libvirt.host [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.704 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.705 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.706 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.707 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.707 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.708 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.709 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.709 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.710 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.711 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.712 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.712 182627 DEBUG nova.virt.hardware [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.719 182627 DEBUG nova.virt.libvirt.vif [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1191926936',display_name='tempest-ServerGroupTestJSON-server-1191926936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1191926936',id=139,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='06011c8d6bc84dc89089b46ecd599b94',ramdisk_id='',reservation_id='r-yzdt3fr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-2145025844',owner_user_name='tempest-ServerGroupTestJSON-214502
5844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:58Z,user_data=None,user_id='5cbb91df5f144eadae17eed4a2d57d77',uuid=5f785f08-848b-4f0c-8abd-1c873b56739b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.720 182627 DEBUG nova.network.os_vif_util [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Converting VIF {"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.722 182627 DEBUG nova.network.os_vif_util [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.724 182627 DEBUG nova.objects.instance [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f785f08-848b-4f0c-8abd-1c873b56739b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.778 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <uuid>5f785f08-848b-4f0c-8abd-1c873b56739b</uuid>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <name>instance-0000008b</name>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerGroupTestJSON-server-1191926936</nova:name>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:42:03</nova:creationTime>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:user uuid="5cbb91df5f144eadae17eed4a2d57d77">tempest-ServerGroupTestJSON-2145025844-project-member</nova:user>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:project uuid="06011c8d6bc84dc89089b46ecd599b94">tempest-ServerGroupTestJSON-2145025844</nova:project>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        <nova:port uuid="723c776a-4b17-4616-9452-6583a71b0739">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <entry name="serial">5f785f08-848b-4f0c-8abd-1c873b56739b</entry>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <entry name="uuid">5f785f08-848b-4f0c-8abd-1c873b56739b</entry>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.config"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:9f:d7:b2"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <target dev="tap723c776a-4b"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/console.log" append="off"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:42:03 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:42:03 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:42:03 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:42:03 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.780 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Preparing to wait for external event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.780 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.780 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.781 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.782 182627 DEBUG nova.virt.libvirt.vif [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:41:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1191926936',display_name='tempest-ServerGroupTestJSON-server-1191926936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1191926936',id=139,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='06011c8d6bc84dc89089b46ecd599b94',ramdisk_id='',reservation_id='r-yzdt3fr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-2145025844',owner_user_name='tempest-ServerGroupTestJSON-2145025844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:41:58Z,user_data=None,user_id='5cbb91df5f144eadae17eed4a2d57d77',uuid=5f785f08-848b-4f0c-8abd-1c873b56739b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.782 182627 DEBUG nova.network.os_vif_util [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Converting VIF {"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.783 182627 DEBUG nova.network.os_vif_util [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.784 182627 DEBUG os_vif [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.785 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.786 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.787 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.793 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.793 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap723c776a-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.794 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap723c776a-4b, col_values=(('external_ids', {'iface-id': '723c776a-4b17-4616-9452-6583a71b0739', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:d7:b2', 'vm-uuid': '5f785f08-848b-4f0c-8abd-1c873b56739b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:03 np0005592767 NetworkManager[54973]: <info>  [1769121723.8350] manager: (tap723c776a-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.833 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.837 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.844 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.846 182627 INFO os_vif [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b')#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.909 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.910 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.910 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] No VIF found with MAC fa:16:3e:9f:d7:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.910 182627 INFO nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Using config drive#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.954 182627 INFO nova.compute.manager [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Resuming#033[00m
Jan 22 17:42:03 np0005592767 nova_compute[182623]: 2026-01-22 22:42:03.955 182627 DEBUG nova.objects.instance [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'flavor' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.002 182627 DEBUG oslo_concurrency.lockutils [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.003 182627 DEBUG oslo_concurrency.lockutils [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquired lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.003 182627 DEBUG nova.network.neutron [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.049 182627 DEBUG nova.compute.manager [req-7d512dad-791f-41ae-abcb-057629dbbbe6 req-34f082c5-b459-4367-aa0e-8809e1d68585 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.050 182627 DEBUG oslo_concurrency.lockutils [req-7d512dad-791f-41ae-abcb-057629dbbbe6 req-34f082c5-b459-4367-aa0e-8809e1d68585 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.050 182627 DEBUG oslo_concurrency.lockutils [req-7d512dad-791f-41ae-abcb-057629dbbbe6 req-34f082c5-b459-4367-aa0e-8809e1d68585 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.052 182627 DEBUG oslo_concurrency.lockutils [req-7d512dad-791f-41ae-abcb-057629dbbbe6 req-34f082c5-b459-4367-aa0e-8809e1d68585 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.052 182627 DEBUG nova.compute.manager [req-7d512dad-791f-41ae-abcb-057629dbbbe6 req-34f082c5-b459-4367-aa0e-8809e1d68585 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.053 182627 WARNING nova.compute.manager [req-7d512dad-791f-41ae-abcb-057629dbbbe6 req-34f082c5-b459-4367-aa0e-8809e1d68585 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received unexpected event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.301 182627 INFO nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Creating config drive at /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.config#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.314 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4u2pzc7z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.457 182627 DEBUG oslo_concurrency.processutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4u2pzc7z" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:04 np0005592767 kernel: tap723c776a-4b: entered promiscuous mode
Jan 22 17:42:04 np0005592767 systemd-udevd[231211]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:42:04 np0005592767 NetworkManager[54973]: <info>  [1769121724.5347] manager: (tap723c776a-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Jan 22 17:42:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:04Z|00527|binding|INFO|Claiming lport 723c776a-4b17-4616-9452-6583a71b0739 for this chassis.
Jan 22 17:42:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:04Z|00528|binding|INFO|723c776a-4b17-4616-9452-6583a71b0739: Claiming fa:16:3e:9f:d7:b2 10.100.0.5
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.538 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.544 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 NetworkManager[54973]: <info>  [1769121724.5487] device (tap723c776a-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:42:04 np0005592767 NetworkManager[54973]: <info>  [1769121724.5564] device (tap723c776a-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.550 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:d7:b2 10.100.0.5'], port_security=['fa:16:3e:9f:d7:b2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5f785f08-848b-4f0c-8abd-1c873b56739b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-422fd61a-583c-421e-a792-399fb219c567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '06011c8d6bc84dc89089b46ecd599b94', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f6c64980-f624-4ba5-a19a-9de15bfc1264', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3040919d-0025-43d8-92fd-351389824ff4, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=723c776a-4b17-4616-9452-6583a71b0739) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.552 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 723c776a-4b17-4616-9452-6583a71b0739 in datapath 422fd61a-583c-421e-a792-399fb219c567 bound to our chassis#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.555 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 422fd61a-583c-421e-a792-399fb219c567#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.568 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[15a50c2e-f61f-4322-9013-f75cac2dc0a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.568 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap422fd61a-51 in ovnmeta-422fd61a-583c-421e-a792-399fb219c567 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.572 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap422fd61a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.572 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[74679849-dcb4-4be8-abda-6130ccb7a42c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.573 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d93dd877-708f-424c-9e46-00f4f063e4e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 systemd-machined[153912]: New machine qemu-69-instance-0000008b.
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.587 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[68a6e993-a10c-4a24-a3a4-991deef0620d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.600 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 systemd[1]: Started Virtual Machine qemu-69-instance-0000008b.
Jan 22 17:42:04 np0005592767 podman[231309]: 2026-01-22 22:42:04.60882916 +0000 UTC m=+0.103652795 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.614 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.615 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[25769b68-91ae-44f1-aace-51ed9fe4f393]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:04Z|00529|binding|INFO|Setting lport 723c776a-4b17-4616-9452-6583a71b0739 ovn-installed in OVS
Jan 22 17:42:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:04Z|00530|binding|INFO|Setting lport 723c776a-4b17-4616-9452-6583a71b0739 up in Southbound
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.620 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.660 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a1b35e-cc6a-46d7-a184-a0d20b2812bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 podman[231305]: 2026-01-22 22:42:04.663365079 +0000 UTC m=+0.157921897 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:42:04 np0005592767 NetworkManager[54973]: <info>  [1769121724.6686] manager: (tap422fd61a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/247)
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.667 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9f806893-e1d3-4125-96c2-202673095392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.705 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2dc739-5ffa-4180-9d97-caea056707a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 systemd-udevd[231380]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.709 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d28a4633-9341-40c3-8ec8-aae2822ef6c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 NetworkManager[54973]: <info>  [1769121724.7391] device (tap422fd61a-50): carrier: link connected
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.745 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d21a0d-511f-4bd9-b92c-f3db1acc9b64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.760 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cee638f2-2cde-4a6a-a76f-318306f027de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap422fd61a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:4f:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532131, 'reachable_time': 38361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231401, 'error': None, 'target': 'ovnmeta-422fd61a-583c-421e-a792-399fb219c567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.774 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56e9180c-b018-43fa-aa92-c1154e2f64bc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:4ff6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532131, 'tstamp': 532131}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231402, 'error': None, 'target': 'ovnmeta-422fd61a-583c-421e-a792-399fb219c567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.800 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[226d64f0-0c6c-40b1-9a75-f7a5ea4ca0a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap422fd61a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:4f:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532131, 'reachable_time': 38361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231403, 'error': None, 'target': 'ovnmeta-422fd61a-583c-421e-a792-399fb219c567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.845 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[188ab82f-1026-49a7-a74f-216fba49c194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.926 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[47a268c4-7f0b-476b-ad8c-719bd74f3f23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.927 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap422fd61a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.928 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.928 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap422fd61a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.974 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 kernel: tap422fd61a-50: entered promiscuous mode
Jan 22 17:42:04 np0005592767 NetworkManager[54973]: <info>  [1769121724.9750] manager: (tap422fd61a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.981 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:04.982 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap422fd61a-50, col_values=(('external_ids', {'iface-id': '3754bc21-131a-4cd8-916f-c6ea0e8161a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.984 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:04Z|00531|binding|INFO|Releasing lport 3754bc21-131a-4cd8-916f-c6ea0e8161a3 from this chassis (sb_readonly=0)
Jan 22 17:42:04 np0005592767 nova_compute[182623]: 2026-01-22 22:42:04.997 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.001 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.002 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/422fd61a-583c-421e-a792-399fb219c567.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/422fd61a-583c-421e-a792-399fb219c567.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.003 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fd14c267-9b21-4e74-be64-e5310bae6e25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.004 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-422fd61a-583c-421e-a792-399fb219c567
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/422fd61a-583c-421e-a792-399fb219c567.pid.haproxy
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 422fd61a-583c-421e-a792-399fb219c567
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.004 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-422fd61a-583c-421e-a792-399fb219c567', 'env', 'PROCESS_TAG=haproxy-422fd61a-583c-421e-a792-399fb219c567', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/422fd61a-583c-421e-a792-399fb219c567.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.097 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121725.0965247, 5f785f08-848b-4f0c-8abd-1c873b56739b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.098 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] VM Started (Lifecycle Event)#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.124 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.128 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121725.0968506, 5f785f08-848b-4f0c-8abd-1c873b56739b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.128 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.151 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.155 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.174 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:42:05 np0005592767 podman[231446]: 2026-01-22 22:42:05.403828221 +0000 UTC m=+0.051895315 container create 1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:42:05 np0005592767 systemd[1]: Started libpod-conmon-1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524.scope.
Jan 22 17:42:05 np0005592767 podman[231446]: 2026-01-22 22:42:05.376166561 +0000 UTC m=+0.024233675 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:42:05 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:42:05 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56ada2a4f18e6ed3bfa7c02e0929deeb5190e582704aacd525dbadcbbc43ecad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:42:05 np0005592767 podman[231446]: 2026-01-22 22:42:05.492229016 +0000 UTC m=+0.140296160 container init 1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:42:05 np0005592767 podman[231446]: 2026-01-22 22:42:05.499436369 +0000 UTC m=+0.147503473 container start 1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:42:05 np0005592767 neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567[231462]: [NOTICE]   (231466) : New worker (231468) forked
Jan 22 17:42:05 np0005592767 neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567[231462]: [NOTICE]   (231466) : Loading success.
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.562 182627 DEBUG nova.network.neutron [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Updated VIF entry in instance network info cache for port 723c776a-4b17-4616-9452-6583a71b0739. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.562 182627 DEBUG nova.network.neutron [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Updating instance_info_cache with network_info: [{"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.564 182627 DEBUG nova.network.neutron [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Updating instance_info_cache with network_info: [{"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.586 182627 DEBUG oslo_concurrency.lockutils [req-f9956e17-0862-4b0f-b23e-36d4b2b3a1ac req-43b4839d-4794-49c8-a7fc-e2c7727999b5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-5f785f08-848b-4f0c-8abd-1c873b56739b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.589 182627 DEBUG oslo_concurrency.lockutils [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Releasing lock "refresh_cache-1fc01b1b-88f4-4078-a423-704c20c2ba9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.594 182627 DEBUG nova.virt.libvirt.vif [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:39:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1667734094',display_name='tempest-ServersNegativeTestJSON-server-1667734094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1667734094',id=130,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:41:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-xnvred5b',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeTestJSON-2095273166-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:42:01Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=1fc01b1b-88f4-4078-a423-704c20c2ba9d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.594 182627 DEBUG nova.network.os_vif_util [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.595 182627 DEBUG nova.network.os_vif_util [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.595 182627 DEBUG os_vif [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.596 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.596 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.596 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.600 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.601 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffdd67b2-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.601 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapffdd67b2-a5, col_values=(('external_ids', {'iface-id': 'ffdd67b2-a5fd-4655-b692-8b8d34b65828', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:ce:32', 'vm-uuid': '1fc01b1b-88f4-4078-a423-704c20c2ba9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.601 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.602 182627 INFO os_vif [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5')#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.621 182627 DEBUG nova.objects.instance [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'numa_topology' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:05 np0005592767 kernel: tapffdd67b2-a5: entered promiscuous mode
Jan 22 17:42:05 np0005592767 NetworkManager[54973]: <info>  [1769121725.7205] manager: (tapffdd67b2-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Jan 22 17:42:05 np0005592767 systemd-udevd[231393]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.722 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:05Z|00532|binding|INFO|Claiming lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 for this chassis.
Jan 22 17:42:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:05Z|00533|binding|INFO|ffdd67b2-a5fd-4655-b692-8b8d34b65828: Claiming fa:16:3e:2b:ce:32 10.100.0.13
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.735 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:ce:32 10.100.0.13'], port_security=['fa:16:3e:2b:ce:32 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '06798119-3cf9-4579-b6fe-7ef0a3f57792', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af00d925-c6e8-4c1e-8ae7-75c6556913d1, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ffdd67b2-a5fd-4655-b692-8b8d34b65828) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.736 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ffdd67b2-a5fd-4655-b692-8b8d34b65828 in datapath 17ab2e5b-049b-4984-a18a-6b3e44614ef5 bound to our chassis#033[00m
Jan 22 17:42:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:05Z|00534|binding|INFO|Setting lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 ovn-installed in OVS
Jan 22 17:42:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:05Z|00535|binding|INFO|Setting lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 up in Southbound
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.739 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ab2e5b-049b-4984-a18a-6b3e44614ef5#033[00m
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.739 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 NetworkManager[54973]: <info>  [1769121725.7414] device (tapffdd67b2-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:42:05 np0005592767 NetworkManager[54973]: <info>  [1769121725.7420] device (tapffdd67b2-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:42:05 np0005592767 nova_compute[182623]: 2026-01-22 22:42:05.742 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.751 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4535af-cd9c-4a8c-98ac-d4643091b2bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.752 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ab2e5b-01 in ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.755 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ab2e5b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.755 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3838071a-7bea-40a1-9c0f-8fa554992b1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.757 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[afbb4eab-42c5-46a6-8908-e000f37bbd39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 systemd-machined[153912]: New machine qemu-70-instance-00000082.
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.770 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a41009-a2de-4fe3-99c4-f170914b88cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 systemd[1]: Started Virtual Machine qemu-70-instance-00000082.
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.786 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4e315e-6f1c-4043-b26a-39c9b1e36834]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.822 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[02bb64e4-6de8-4383-9d8d-765b8101d2d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.827 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c41248-4dd6-4749-ba00-6775ccc9bb4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 NetworkManager[54973]: <info>  [1769121725.8301] manager: (tap17ab2e5b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.862 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[67e18173-2f45-4a01-b3bd-a55630cf93c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.866 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f893f6-4bef-4bbb-a2d3-c89a16632d40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 NetworkManager[54973]: <info>  [1769121725.8997] device (tap17ab2e5b-00): carrier: link connected
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.907 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7d0e64-5729-4a88-ae4c-74c279651d48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.925 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b2a1fd-0069-443a-a531-6b31dcec137f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ab2e5b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:d4:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532247, 'reachable_time': 34895, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231509, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.941 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1c30569b-0c88-4d9d-a53f-cf64ea673bc3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:d4d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532247, 'tstamp': 532247}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231510, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.959 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e032d4d2-838a-4e91-a7a3-989f4f2a0103]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ab2e5b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:d4:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532247, 'reachable_time': 34895, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231511, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:05.986 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a8b126-6e5c-4c70-8865-2bee990e4a5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.066 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[10f2fbe4-7c6c-4918-9c71-79d91b400e97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.068 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ab2e5b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.068 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.069 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ab2e5b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:06 np0005592767 kernel: tap17ab2e5b-00: entered promiscuous mode
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.079 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:06 np0005592767 NetworkManager[54973]: <info>  [1769121726.0811] manager: (tap17ab2e5b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.082 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.084 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ab2e5b-00, col_values=(('external_ids', {'iface-id': 'e1725d3a-3bc9-46b5-a1d1-153d0147aff7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.085 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:06 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:06Z|00536|binding|INFO|Releasing lport e1725d3a-3bc9-46b5-a1d1-153d0147aff7 from this chassis (sb_readonly=0)
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.109 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.114 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.115 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b72302a9-155a-4b12-a2ea-39323fbba73d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.116 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/17ab2e5b-049b-4984-a18a-6b3e44614ef5.pid.haproxy
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 17ab2e5b-049b-4984-a18a-6b3e44614ef5
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:42:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:06.117 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'env', 'PROCESS_TAG=haproxy-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ab2e5b-049b-4984-a18a-6b3e44614ef5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.124 182627 DEBUG nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.124 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.124 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.125 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.125 182627 DEBUG nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Processing event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.125 182627 DEBUG nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.125 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.125 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.125 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.126 182627 DEBUG nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] No waiting events found dispatching network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.126 182627 WARNING nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received unexpected event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.126 182627 DEBUG nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.126 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.126 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.126 182627 DEBUG oslo_concurrency.lockutils [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.127 182627 DEBUG nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.127 182627 WARNING nova.compute.manager [req-e3d6b71b-f563-442a-a28c-18cf69a12e4d req-6b049452-4731-4f36-a4f4-24ca2916cbc3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received unexpected event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.127 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.132 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121726.1319778, 5f785f08-848b-4f0c-8abd-1c873b56739b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.132 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.134 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.140 182627 INFO nova.virt.libvirt.driver [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Instance spawned successfully.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.140 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.161 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.167 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.169 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.170 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.170 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.170 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.171 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.171 182627 DEBUG nova.virt.libvirt.driver [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.194 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.233 182627 INFO nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Took 7.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.234 182627 DEBUG nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.248 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 1fc01b1b-88f4-4078-a423-704c20c2ba9d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.249 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121726.248259, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.249 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Started (Lifecycle Event)#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.271 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.283 182627 DEBUG nova.compute.manager [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.284 182627 DEBUG nova.objects.instance [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.288 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.312 182627 INFO nova.virt.libvirt.driver [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Instance running successfully.#033[00m
Jan 22 17:42:06 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.324 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.324 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121726.2652318, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.324 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.330 182627 DEBUG nova.virt.libvirt.guest [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.330 182627 DEBUG nova.compute.manager [None req-eee68a51-9dca-48f3-afcc-2cc56ecccf4d 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.343 182627 INFO nova.compute.manager [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Took 8.24 seconds to build instance.#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.345 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.350 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.375 182627 DEBUG oslo_concurrency.lockutils [None req-de6bd9b3-1210-4762-a79a-bdcba42305fd 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:06 np0005592767 nova_compute[182623]: 2026-01-22 22:42:06.385 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 22 17:42:06 np0005592767 podman[231549]: 2026-01-22 22:42:06.567700222 +0000 UTC m=+0.073438564 container create 3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 17:42:06 np0005592767 systemd[1]: Started libpod-conmon-3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959.scope.
Jan 22 17:42:06 np0005592767 podman[231549]: 2026-01-22 22:42:06.536617875 +0000 UTC m=+0.042356237 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:42:06 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:42:06 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c43d10702ca5c526fbe7bb44a13de3f17faf19113f0af2b4433d741bce00ba2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:42:06 np0005592767 podman[231549]: 2026-01-22 22:42:06.663850695 +0000 UTC m=+0.169589037 container init 3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:42:06 np0005592767 podman[231549]: 2026-01-22 22:42:06.669896375 +0000 UTC m=+0.175634717 container start 3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 17:42:06 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [NOTICE]   (231568) : New worker (231570) forked
Jan 22 17:42:06 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [NOTICE]   (231568) : Loading success.
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.032 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.093 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.093 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.093 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.094 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.094 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.104 182627 INFO nova.compute.manager [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Terminating instance#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.115 182627 DEBUG nova.compute.manager [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:42:07 np0005592767 kernel: tap723c776a-4b (unregistering): left promiscuous mode
Jan 22 17:42:07 np0005592767 NetworkManager[54973]: <info>  [1769121727.1400] device (tap723c776a-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.188 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:07Z|00537|binding|INFO|Releasing lport 723c776a-4b17-4616-9452-6583a71b0739 from this chassis (sb_readonly=0)
Jan 22 17:42:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:07Z|00538|binding|INFO|Setting lport 723c776a-4b17-4616-9452-6583a71b0739 down in Southbound
Jan 22 17:42:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:07Z|00539|binding|INFO|Removing iface tap723c776a-4b ovn-installed in OVS
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.199 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:d7:b2 10.100.0.5'], port_security=['fa:16:3e:9f:d7:b2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5f785f08-848b-4f0c-8abd-1c873b56739b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-422fd61a-583c-421e-a792-399fb219c567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '06011c8d6bc84dc89089b46ecd599b94', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f6c64980-f624-4ba5-a19a-9de15bfc1264', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3040919d-0025-43d8-92fd-351389824ff4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=723c776a-4b17-4616-9452-6583a71b0739) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.200 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 723c776a-4b17-4616-9452-6583a71b0739 in datapath 422fd61a-583c-421e-a792-399fb219c567 unbound from our chassis#033[00m
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.201 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 422fd61a-583c-421e-a792-399fb219c567, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.202 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b237395a-c813-4846-a5ba-0bed97362e7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.203 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-422fd61a-583c-421e-a792-399fb219c567 namespace which is not needed anymore#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.206 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 22 17:42:07 np0005592767 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008b.scope: Consumed 1.415s CPU time.
Jan 22 17:42:07 np0005592767 systemd-machined[153912]: Machine qemu-69-instance-0000008b terminated.
Jan 22 17:42:07 np0005592767 NetworkManager[54973]: <info>  [1769121727.3335] manager: (tap723c776a-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/252)
Jan 22 17:42:07 np0005592767 neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567[231462]: [NOTICE]   (231466) : haproxy version is 2.8.14-c23fe91
Jan 22 17:42:07 np0005592767 neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567[231462]: [NOTICE]   (231466) : path to executable is /usr/sbin/haproxy
Jan 22 17:42:07 np0005592767 neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567[231462]: [ALERT]    (231466) : Current worker (231468) exited with code 143 (Terminated)
Jan 22 17:42:07 np0005592767 neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567[231462]: [WARNING]  (231466) : All workers exited. Exiting... (0)
Jan 22 17:42:07 np0005592767 systemd[1]: libpod-1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524.scope: Deactivated successfully.
Jan 22 17:42:07 np0005592767 podman[231600]: 2026-01-22 22:42:07.367327714 +0000 UTC m=+0.057966417 container died 1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.401 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000082', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5906f64d8ee84f068ff9caa68ae3652b', 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'hostId': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:42:07 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524-userdata-shm.mount: Deactivated successfully.
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.404 182627 INFO nova.virt.libvirt.driver [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Instance destroyed successfully.#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.405 182627 DEBUG nova.objects.instance [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lazy-loading 'resources' on Instance uuid 5f785f08-848b-4f0c-8abd-1c873b56739b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.410 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5f785f08-848b-4f0c-8abd-1c873b56739b', 'name': 'tempest-ServerGroupTestJSON-server-1191926936', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008b', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '06011c8d6bc84dc89089b46ecd599b94', 'user_id': '5cbb91df5f144eadae17eed4a2d57d77', 'hostId': '15eb77effd43edb9c253da8884d316c82fac459c210ca60858d137c1', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.411 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:42:07 np0005592767 systemd[1]: var-lib-containers-storage-overlay-56ada2a4f18e6ed3bfa7c02e0929deeb5190e582704aacd525dbadcbbc43ecad-merged.mount: Deactivated successfully.
Jan 22 17:42:07 np0005592767 podman[231600]: 2026-01-22 22:42:07.425825405 +0000 UTC m=+0.116464108 container cleanup 1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:42:07 np0005592767 systemd[1]: libpod-conmon-1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524.scope: Deactivated successfully.
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.438 182627 DEBUG nova.virt.libvirt.vif [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1191926936',display_name='tempest-ServerGroupTestJSON-server-1191926936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1191926936',id=139,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:42:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='06011c8d6bc84dc89089b46ecd599b94',ramdisk_id='',reservation_id='r-yzdt3fr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-2145025844',owner_user_name='tempest-ServerGroupTestJSON-2145025844-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:42:06Z,user_data=None,user_id='5cbb91df5f144eadae17eed4a2d57d77',uuid=5f785f08-848b-4f0c-8abd-1c873b56739b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.439 182627 DEBUG nova.network.os_vif_util [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Converting VIF {"id": "723c776a-4b17-4616-9452-6583a71b0739", "address": "fa:16:3e:9f:d7:b2", "network": {"id": "422fd61a-583c-421e-a792-399fb219c567", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-276523502-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "06011c8d6bc84dc89089b46ecd599b94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723c776a-4b", "ovs_interfaceid": "723c776a-4b17-4616-9452-6583a71b0739", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.439 182627 DEBUG nova.network.os_vif_util [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.439 182627 DEBUG os_vif [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.441 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.441 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap723c776a-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.443 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.444 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.445 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.447 182627 INFO os_vif [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:d7:b2,bridge_name='br-int',has_traffic_filtering=True,id=723c776a-4b17-4616-9452-6583a71b0739,network=Network(422fd61a-583c-421e-a792-399fb219c567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723c776a-4b')#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.447 182627 INFO nova.virt.libvirt.driver [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Deleting instance files /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b_del#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.448 182627 INFO nova.virt.libvirt.driver [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Deletion of /var/lib/nova/instances/5f785f08-848b-4f0c-8abd-1c873b56739b_del complete#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.461 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.462 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.464 12 DEBUG ceilometer.compute.pollsters [-] Instance 5f785f08-848b-4f0c-8abd-1c873b56739b was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2282ff64-e2f1-421f-886b-791f538e251e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.411735', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94cd9f30-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '4af90b9b19e269759d1cb8e339f2c739c766ca7fc683c2acaad0baaa6af9d379'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 
'1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.411735', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94cdb38a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': 'f628bf1a44a50cb648fe37d290a709fc96911d5e223f6c3adb65af66fda843eb'}]}, 'timestamp': '2026-01-22 22:42:07.464372', '_unique_id': 'b6e5337d4c8d45568cb48ee24bb1b0ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.466 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.469 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.497 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.498 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 1fc01b1b-88f4-4078-a423-704c20c2ba9d: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.499 12 DEBUG ceilometer.compute.pollsters [-] Instance 5f785f08-848b-4f0c-8abd-1c873b56739b was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.499 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.499 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.499 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>]
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.502 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1fc01b1b-88f4-4078-a423-704c20c2ba9d / tapffdd67b2-a5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.502 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.503 12 DEBUG ceilometer.compute.pollsters [-] Instance 5f785f08-848b-4f0c-8abd-1c873b56739b was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6309116-da09-48a5-b2cd-cdc5e7436a32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.500339', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94d3cf5e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': 'f84bc9b5751ea7f26be39c6d15a3f8b6f1038a034ce1184c1494316a5b7e50a8'}]}, 'timestamp': '2026-01-22 22:42:07.504040', '_unique_id': 'c30c92e409ee4c07a2e20bae92a66ab0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.505 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.508 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.outgoing.bytes volume: 300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 podman[231649]: 2026-01-22 22:42:07.508370244 +0000 UTC m=+0.054855589 container remove 1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.510 12 DEBUG ceilometer.compute.pollsters [-] Instance 5f785f08-848b-4f0c-8abd-1c873b56739b was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '431a3616-9ab0-4cc2-a0d0-3f81270a1fc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 300, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.508145', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94d4a276-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '267427061ea7e09ebfbf9c780b4fc5b09115dcbd68ea95b36e7cb5d2af3c6665'}]}, 'timestamp': '2026-01-22 22:42:07.510370', '_unique_id': '2aeeb3a268224bf68d46d841cedfbe41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.512 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.513 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.513 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.513 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>]
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.513 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.513 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.514 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.514 182627 INFO nova.compute.manager [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.515 182627 DEBUG oslo.service.loopingcall [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.515 182627 DEBUG nova.compute.manager [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.515 182627 DEBUG nova.network.neutron [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.515 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.516 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[14fd4bc4-13c8-4766-9269-86aab0e42639]: (4, ('Thu Jan 22 10:42:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567 (1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524)\n1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524\nThu Jan 22 10:42:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-422fd61a-583c-421e-a792-399fb219c567 (1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524)\n1372caa72ec185779bd018ea5581c1015468bd47415c943523463e9a7298d524\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.517 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1f3c80-9359-46ba-b53b-0000780ef6d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9045fdc-b47b-4965-afa3-6171fb3c562e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.513719', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94d57516-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '96240b3f8ff989c9e917084ebc5f6bff029f5aff36667f73d0919426c118faa0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.513719', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94d583d0-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '368116215aeab867539f3ec3ab834bb9c385334ebe1ed93446b4ed15f4c67c48'}]}, 'timestamp': '2026-01-22 22:42:07.516592', '_unique_id': 'afbad5e6c9b74977b47d5a5c65a273b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.518 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap422fd61a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.517 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.519 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.519 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.520 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 kernel: tap422fd61a-50: left promiscuous mode
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.524 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.523 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53be028e-7b3f-401c-9efb-9e9116f76082', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.519216', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94d64ce8-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '116cacbeec9a31c4d9bb3f48a6c3c54fca7ffa8026e11f616553a526be24a176'}]}, 'timestamp': '2026-01-22 22:42:07.524995', '_unique_id': '52a598ddc5c04688b56f7e09ceff607e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.526 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7eff0431-8c96-4827-b8bf-9aebe2fdc530]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.525 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.527 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.527 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>]
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.527 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.529 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce603ed6-38f7-4124-b989-39061455f09a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.527954', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94d7a44e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '0635c29cbf508ce95eeb4e8ce22b7890c4b6594d766872368a5ce3d57f46851b'}]}, 'timestamp': '2026-01-22 22:42:07.529747', '_unique_id': '8ef11252126e47399f9184ed0b9e99ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.530 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.532 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.532 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.532 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 nova_compute[182623]: 2026-01-22 22:42:07.536 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.537 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ae485a-85ca-4b68-b768-61f26eb6dc67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.532369', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94d84df4-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': 'c75d127c3c7cdcd1b37f528f33defb83c455af04440b117031ed90828522b71c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.532369', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94d85b0a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '49efec875d3d8ab4d15fc325c65204030f6385e9ee23626d08d0e40ec7afd32e'}]}, 'timestamp': '2026-01-22 22:42:07.537866', '_unique_id': '817b729ef60d4d64a4f33d4c65969589'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.538 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.547 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[09a9fd29-5d14-4bc5-8906-80e5c1db925f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.548 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[595e1641-2672-4eb7-b353-a96f6a04b2e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.553 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.554 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.555 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ca8db25-9761-4dab-99fc-6803bfb2b272', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.540143', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94db9720-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.175050123, 'message_signature': 'eaaeafdcfe73423f83ae00a9e3212eca53e3cd72a8ccbd7c5228db37196c7e67'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.540143', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94dba63e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.175050123, 'message_signature': '21fd83fa3b5393da79f8901533c3053c2e8bf10a1014a24c4bfd7b3cf378541d'}]}, 'timestamp': '2026-01-22 22:42:07.555528', '_unique_id': '5299d088ee1949bb8e8c231bf8853c7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.556 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.557 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.557 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.558 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '846deaeb-a94a-4f2a-b0d8-f11d8a4e552f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.557666', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94dc2a28-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': 'e8c19f98f98c1698e38fcbd0f6d36fe4fc6b315cd9297be864de92023da449f9'}]}, 'timestamp': '2026-01-22 22:42:07.558692', '_unique_id': 'df1bf7f3eb874fdf8a7c16a40bf559d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.559 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.560 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.561 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.562 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8daa42ca-ecb9-4403-82b7-890c2977844e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532123, 'reachable_time': 33976, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231664, 'error': None, 'target': 'ovnmeta-422fd61a-583c-421e-a792-399fb219c567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75bdfb13-9aec-4ea6-97a2-9485dd0c25d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.560815', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94dca52a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '0a5e4453e023a291db9f24db2c629a624e9d8e8aec91fb3984aa1204fc16b179'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.560815', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94dcb42a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '533aab609d966594393314728ad865e6225c52c257b106af8733c38fb942dd60'}]}, 'timestamp': '2026-01-22 22:42:07.562267', '_unique_id': '0782cdf66bbc4f44a03a1c7633064335'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.562 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.564 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.outgoing.packets volume: 5 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.565 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 systemd[1]: run-netns-ovnmeta\x2d422fd61a\x2d583c\x2d421e\x2da792\x2d399fb219c567.mount: Deactivated successfully.
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.566 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-422fd61a-583c-421e-a792-399fb219c567 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 17:42:07 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:07.566 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[202d1156-bf52-468a-856d-6c4237abfb22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '222c9a25-b146-4d33-902b-cfadb96c8b78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 5, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.564551', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94dd3738-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '603c55d07c4a7b200c8027f6b09cf9a7fc28b9013fb13f711e6f59e1f37d2645'}]}, 'timestamp': '2026-01-22 22:42:07.565611', '_unique_id': 'd6ed1d6e6be544c0a31235ac43d5e982'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.566 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.568 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.568 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.570 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae4a9a3d-116f-43e5-9671-08efc201d4ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.568455', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94ddcffe-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.175050123, 'message_signature': '43127fd61ca4a78e9d780d8449e419d4b81000ed2f2bcc0436ee6cff7ee4bf57'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 
'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.568455', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94dddd5a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.175050123, 'message_signature': 'b5a11cf911210f091cd634e4558e5a58f6817adbc7c5158abdba364dc7d9bc24'}]}, 'timestamp': '2026-01-22 22:42:07.570350', '_unique_id': '595178f4dcd641dc941887a99e2ac5c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.571 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.572 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.572 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.573 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.574 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3770caad-cdf7-4ef5-a5ce-d13e1705a3db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.572882', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94de7c7e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.175050123, 'message_signature': '03b63c74c54ea414925628e10c7735fb227ef4e5a0eb1c0c8948ae5103afc667'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 
'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.572882', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94de8afc-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.175050123, 'message_signature': '51890acf1c663b3f5097eff12c61cb13d70e0482167fec16343c497e6c840fea'}]}, 'timestamp': '2026-01-22 22:42:07.574628', '_unique_id': '0e326a4678fc476288cb0ef15c7c0b69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.575 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.576 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.576 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.577 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ff3de27-8a54-4459-a969-a5464360142b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.576801', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94df15da-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '26ea99e38078fd183e3c1e02137ab20f738a6fc2a34f4695f8e895f6025e7258'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 
'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.576801', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94df253e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': 'fb22ef73fcb7dc8b1acef0bac0218ed79f5ddc80fb56e1a3b8cfd1e1ad777835'}]}, 'timestamp': '2026-01-22 22:42:07.578281', '_unique_id': '43d004e784d34eb8b71668cbc3581443'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.578 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.580 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.580 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.580 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.581 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7462af2d-8ad7-4ea1-a87b-06b5de14dd65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-vda', 'timestamp': '2026-01-22T22:42:07.580471', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '94dfa4dc-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': 'b2b8800c0409421e02d5579dc8b26b19e1c47a2a7e366ea97eca8eb0a8bfdb2a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 
'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d-sda', 'timestamp': '2026-01-22T22:42:07.580471', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '94dfb22e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.04664155, 'message_signature': '13bf06b2f365d48a170b74248a9d8542f1767e9d28df02c752eb4a80fe60d95d'}]}, 'timestamp': '2026-01-22 22:42:07.581967', '_unique_id': '7aec41e3b64f4a3c947a2f4146861b6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.582 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.586 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.587 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/cpu volume: 1100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.588 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a803761-50bf-42a2-910f-fd271a355b53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100000000, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'timestamp': '2026-01-22T22:42:07.587019', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'instance-00000082', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '94e0a99a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.132347088, 'message_signature': '26ab6a65f263cb8468a09d630fec5fe2a3fcfad8b323967cdeb315903107fc30'}]}, 'timestamp': '2026-01-22 22:42:07.588613', '_unique_id': '2d807ab221d341f7ac9714678ab0f193'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.589 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.590 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.590 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.591 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '054e43fc-9578-46b0-b0b2-883097241a85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.590766', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94e13680-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '3e624702ed3f7a01f08c8f3312781de1bb28cc20fa2dafdd9ff7d09e810c2194'}]}, 'timestamp': '2026-01-22 22:42:07.591724', '_unique_id': 'df2c17180c1440b1bb94e0851bf57128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.592 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.594 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.594 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bacc16cc-cb24-48b4-b55c-8835e700a598', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.594052', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94e1b6c8-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '3d61c219cdd57c22b8759f3205b27069710f7d98e274c1306d2c105aa890fb2f'}]}, 'timestamp': '2026-01-22 22:42:07.595064', '_unique_id': '4c27fe662dc441b8b83dae4e5b1b2874'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.595 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.596 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.597 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4686256b-b1a1-437d-ad49-8a6810c630e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.596550', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94e21690-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '33a19720e3e947dc9b11a0687d9d56938460799d9a930ac5a99023730b917b7e'}]}, 'timestamp': '2026-01-22 22:42:07.597436', '_unique_id': '27be4e37855a4d66a59db0199f709715'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.598 12 DEBUG ceilometer.compute.pollsters [-] 1fc01b1b-88f4-4078-a423-704c20c2ba9d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b'
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.599 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000008b, id=5f785f08-848b-4f0c-8abd-1c873b56739b>: [Error Code 42] Domain not found: no domain with matching uuid '5f785f08-848b-4f0c-8abd-1c873b56739b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f1015b5-f0a1-4fe0-afc9-d7e462cc66d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '45cd11974e6648e1872fb5ebf9dee0b1', 'user_name': None, 'project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'project_name': None, 'resource_id': 'instance-00000082-1fc01b1b-88f4-4078-a423-704c20c2ba9d-tapffdd67b2-a5', 'timestamp': '2026-01-22T22:42:07.598896', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1667734094', 'name': 'tapffdd67b2-a5', 'instance_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'instance_type': 'm1.nano', 'host': 'afdf07bfb09def9eef802ab3e8ad7eea3976551d91f00e474ed30253', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10'}, 'image_ref': '5f4335bf-4b3b-43ea-ad0b-f20b635f2f10', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2b:ce:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapffdd67b2-a5'}, 'message_id': '94e27392-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5324.135191098, 'message_signature': '45a021c37c563705e9d522869a6f85f345aa93b9d35cca4502e8983f6106a6e2'}]}, 'timestamp': '2026-01-22 22:42:07.599814', '_unique_id': '59d591da46e94e22a80f4385703201b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.600 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.601 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:42:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:42:07.601 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1667734094>, <NovaLikeServer: tempest-ServerGroupTestJSON-server-1191926936>]
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.090 182627 DEBUG nova.network.neutron [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.107 182627 INFO nova.compute.manager [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Took 0.59 seconds to deallocate network for instance.#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.268 182627 DEBUG nova.compute.manager [req-4caf8c47-c436-4380-b8d2-32d17a0e3455 req-1bc1676a-089c-40bd-be45-5c7ea1964e34 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-vif-deleted-723c776a-4b17-4616-9452-6583a71b0739 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.275 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.275 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.276 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.276 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.276 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.276 182627 WARNING nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received unexpected event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.277 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-vif-unplugged-723c776a-4b17-4616-9452-6583a71b0739 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.277 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.277 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.278 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.278 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] No waiting events found dispatching network-vif-unplugged-723c776a-4b17-4616-9452-6583a71b0739 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.278 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-vif-unplugged-723c776a-4b17-4616-9452-6583a71b0739 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.279 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.279 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.279 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.279 182627 DEBUG oslo_concurrency.lockutils [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.279 182627 DEBUG nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] No waiting events found dispatching network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.280 182627 WARNING nova.compute.manager [req-59f09b38-5891-49f6-886c-a8e402e7a851 req-24b22596-8b15-4156-8250-1ff23981106b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Received unexpected event network-vif-plugged-723c776a-4b17-4616-9452-6583a71b0739 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.347 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.348 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.429 182627 DEBUG nova.compute.provider_tree [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.463 182627 DEBUG nova.scheduler.client.report [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.487 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.553 182627 INFO nova.scheduler.client.report [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Deleted allocations for instance 5f785f08-848b-4f0c-8abd-1c873b56739b#033[00m
Jan 22 17:42:08 np0005592767 nova_compute[182623]: 2026-01-22 22:42:08.778 182627 DEBUG oslo_concurrency.lockutils [None req-10225997-33e0-484e-827c-d4e76a3dfc70 5cbb91df5f144eadae17eed4a2d57d77 06011c8d6bc84dc89089b46ecd599b94 - - default default] Lock "5f785f08-848b-4f0c-8abd-1c873b56739b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:12 np0005592767 nova_compute[182623]: 2026-01-22 22:42:12.035 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:12.114 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:12.114 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:12.115 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:12 np0005592767 podman[231685]: 2026-01-22 22:42:12.155947101 +0000 UTC m=+0.069303427 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:42:12 np0005592767 podman[231684]: 2026-01-22 22:42:12.167942409 +0000 UTC m=+0.076037797 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 17:42:12 np0005592767 nova_compute[182623]: 2026-01-22 22:42:12.486 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:12 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:12Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:ce:32 10.100.0.13
Jan 22 17:42:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:13Z|00540|binding|INFO|Releasing lport e1725d3a-3bc9-46b5-a1d1-153d0147aff7 from this chassis (sb_readonly=0)
Jan 22 17:42:13 np0005592767 nova_compute[182623]: 2026-01-22 22:42:13.226 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.667 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.667 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.668 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.668 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.668 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.691 182627 INFO nova.compute.manager [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Terminating instance#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.702 182627 DEBUG nova.compute.manager [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:42:15 np0005592767 kernel: tapffdd67b2-a5 (unregistering): left promiscuous mode
Jan 22 17:42:15 np0005592767 NetworkManager[54973]: <info>  [1769121735.7284] device (tapffdd67b2-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.735 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:15Z|00541|binding|INFO|Releasing lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 from this chassis (sb_readonly=0)
Jan 22 17:42:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:15Z|00542|binding|INFO|Setting lport ffdd67b2-a5fd-4655-b692-8b8d34b65828 down in Southbound
Jan 22 17:42:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:15Z|00543|binding|INFO|Removing iface tapffdd67b2-a5 ovn-installed in OVS
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.737 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:15 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.738 182627 DEBUG nova.compute.manager [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 22 17:42:15 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:42:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:15.744 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:ce:32 10.100.0.13'], port_security=['fa:16:3e:2b:ce:32 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1fc01b1b-88f4-4078-a423-704c20c2ba9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5906f64d8ee84f068ff9caa68ae3652b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '06798119-3cf9-4579-b6fe-7ef0a3f57792', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af00d925-c6e8-4c1e-8ae7-75c6556913d1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ffdd67b2-a5fd-4655-b692-8b8d34b65828) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:15.747 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ffdd67b2-a5fd-4655-b692-8b8d34b65828 in datapath 17ab2e5b-049b-4984-a18a-6b3e44614ef5 unbound from our chassis#033[00m
Jan 22 17:42:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:15.750 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ab2e5b-049b-4984-a18a-6b3e44614ef5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.752 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:15.752 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[989d90d5-4d95-4140-9490-4672b7af80ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:15.754 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 namespace which is not needed anymore#033[00m
Jan 22 17:42:15 np0005592767 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 22 17:42:15 np0005592767 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000082.scope: Consumed 5.925s CPU time.
Jan 22 17:42:15 np0005592767 systemd-machined[153912]: Machine qemu-70-instance-00000082 terminated.
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.855 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.856 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.877 182627 DEBUG nova.objects.instance [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_requests' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.890 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.890 182627 INFO nova.compute.claims [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.890 182627 DEBUG nova.objects.instance [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.902 182627 DEBUG nova.objects.instance [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:15 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [NOTICE]   (231568) : haproxy version is 2.8.14-c23fe91
Jan 22 17:42:15 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [NOTICE]   (231568) : path to executable is /usr/sbin/haproxy
Jan 22 17:42:15 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [WARNING]  (231568) : Exiting Master process...
Jan 22 17:42:15 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [WARNING]  (231568) : Exiting Master process...
Jan 22 17:42:15 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [ALERT]    (231568) : Current worker (231570) exited with code 143 (Terminated)
Jan 22 17:42:15 np0005592767 neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5[231564]: [WARNING]  (231568) : All workers exited. Exiting... (0)
Jan 22 17:42:15 np0005592767 systemd[1]: libpod-3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959.scope: Deactivated successfully.
Jan 22 17:42:15 np0005592767 conmon[231564]: conmon 3c14694c97279bbdfe60 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959.scope/container/memory.events
Jan 22 17:42:15 np0005592767 podman[231750]: 2026-01-22 22:42:15.923977518 +0000 UTC m=+0.042907782 container died 3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:42:15 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959-userdata-shm.mount: Deactivated successfully.
Jan 22 17:42:15 np0005592767 systemd[1]: var-lib-containers-storage-overlay-5c43d10702ca5c526fbe7bb44a13de3f17faf19113f0af2b4433d741bce00ba2-merged.mount: Deactivated successfully.
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.959 182627 INFO nova.compute.resource_tracker [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating resource usage from migration 543c8230-9924-4708-9d79-4550f2d7f7ee#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.960 182627 DEBUG nova.compute.resource_tracker [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Starting to track incoming migration 543c8230-9924-4708-9d79-4550f2d7f7ee with flavor 617fb2f8-2c15-4939-a64a-90fca4acd12a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 22 17:42:15 np0005592767 podman[231750]: 2026-01-22 22:42:15.962991959 +0000 UTC m=+0.081922173 container cleanup 3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.968 182627 INFO nova.virt.libvirt.driver [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Instance destroyed successfully.#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.969 182627 DEBUG nova.objects.instance [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lazy-loading 'resources' on Instance uuid 1fc01b1b-88f4-4078-a423-704c20c2ba9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:15 np0005592767 systemd[1]: libpod-conmon-3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959.scope: Deactivated successfully.
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.995 182627 DEBUG nova.virt.libvirt.vif [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T22:39:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1667734094',display_name='tempest-ServersNegativeTestJSON-server-1667734094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1667734094',id=130,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:41:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5906f64d8ee84f068ff9caa68ae3652b',ramdisk_id='',reservation_id='r-xnvred5b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-2095273166',owner_user_name='tempest-ServersNegativeTestJSON-2095273166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:42:06Z,user_data=None,user_id='45cd11974e6648e1872fb5ebf9dee0b1',uuid=1fc01b1b-88f4-4078-a423-704c20c2ba9d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.996 182627 DEBUG nova.network.os_vif_util [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converting VIF {"id": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "address": "fa:16:3e:2b:ce:32", "network": {"id": "17ab2e5b-049b-4984-a18a-6b3e44614ef5", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-508927540-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5906f64d8ee84f068ff9caa68ae3652b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapffdd67b2-a5", "ovs_interfaceid": "ffdd67b2-a5fd-4655-b692-8b8d34b65828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.996 182627 DEBUG nova.network.os_vif_util [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.996 182627 DEBUG os_vif [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.997 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:15 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.998 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffdd67b2-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:15.999 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.001 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.003 182627 INFO os_vif [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:ce:32,bridge_name='br-int',has_traffic_filtering=True,id=ffdd67b2-a5fd-4655-b692-8b8d34b65828,network=Network(17ab2e5b-049b-4984-a18a-6b3e44614ef5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffdd67b2-a5')#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.003 182627 INFO nova.virt.libvirt.driver [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Deleting instance files /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d_del#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.008 182627 INFO nova.virt.libvirt.driver [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Deletion of /var/lib/nova/instances/1fc01b1b-88f4-4078-a423-704c20c2ba9d_del complete#033[00m
Jan 22 17:42:16 np0005592767 podman[231796]: 2026-01-22 22:42:16.028577448 +0000 UTC m=+0.038850686 container remove 3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.033 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[53f62933-add7-4de1-8bb9-804b0c225291]: (4, ('Thu Jan 22 10:42:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 (3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959)\n3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959\nThu Jan 22 10:42:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 (3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959)\n3c14694c97279bbdfe60f1240c217421c1c66f252c63c6b347802686a60a6959\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.035 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4696a203-6cfd-4a97-8b28-3294b05f2b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.036 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ab2e5b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.038 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:16 np0005592767 kernel: tap17ab2e5b-00: left promiscuous mode
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.096 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.099 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e1fb7560-5546-4ae0-856c-72163d4e3836]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.102 182627 INFO nova.compute.manager [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.103 182627 DEBUG oslo.service.loopingcall [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.105 182627 DEBUG nova.compute.manager [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.106 182627 DEBUG nova.network.neutron [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.111 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee94e74-e80b-4051-a41e-9fe452ffc01c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.112 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbb32a4-8283-4525-8e09-b00b2fd60922]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.125 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3e8d9e-d2c1-49f2-9dbf-e2852216b4d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532239, 'reachable_time': 27731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231811, 'error': None, 'target': 'ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.127 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ab2e5b-049b-4984-a18a-6b3e44614ef5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:42:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:16.128 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[260a7a5a-95e2-4f7e-b80e-2a715f24e81a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:16 np0005592767 systemd[1]: run-netns-ovnmeta\x2d17ab2e5b\x2d049b\x2d4984\x2da18a\x2d6b3e44614ef5.mount: Deactivated successfully.
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.132 182627 DEBUG nova.compute.provider_tree [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.146 182627 DEBUG nova.scheduler.client.report [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.169 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.169 182627 INFO nova.compute.manager [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Migrating#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.206 182627 DEBUG nova.compute.manager [req-1baebdb0-35f9-40d1-839a-4ad3eaa27757 req-64aacf21-9550-4656-936c-5019a63d0541 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-unplugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.206 182627 DEBUG oslo_concurrency.lockutils [req-1baebdb0-35f9-40d1-839a-4ad3eaa27757 req-64aacf21-9550-4656-936c-5019a63d0541 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.206 182627 DEBUG oslo_concurrency.lockutils [req-1baebdb0-35f9-40d1-839a-4ad3eaa27757 req-64aacf21-9550-4656-936c-5019a63d0541 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.207 182627 DEBUG oslo_concurrency.lockutils [req-1baebdb0-35f9-40d1-839a-4ad3eaa27757 req-64aacf21-9550-4656-936c-5019a63d0541 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.207 182627 DEBUG nova.compute.manager [req-1baebdb0-35f9-40d1-839a-4ad3eaa27757 req-64aacf21-9550-4656-936c-5019a63d0541 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-unplugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:16 np0005592767 nova_compute[182623]: 2026-01-22 22:42:16.207 182627 DEBUG nova.compute.manager [req-1baebdb0-35f9-40d1-839a-4ad3eaa27757 req-64aacf21-9550-4656-936c-5019a63d0541 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-unplugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.692 182627 DEBUG nova.network.neutron [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.714 182627 INFO nova.compute.manager [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Took 1.61 seconds to deallocate network for instance.#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.843 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.844 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.874 182627 DEBUG nova.compute.manager [req-33c3c4d2-07e7-4f39-ac3b-6af232d8188d req-4cc9fb89-c7b0-421a-a04b-a4a4a9e5d4cc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-deleted-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.904 182627 DEBUG nova.compute.provider_tree [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.917 182627 DEBUG nova.scheduler.client.report [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.935 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:17 np0005592767 nova_compute[182623]: 2026-01-22 22:42:17.959 182627 INFO nova.scheduler.client.report [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Deleted allocations for instance 1fc01b1b-88f4-4078-a423-704c20c2ba9d#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.025 182627 DEBUG oslo_concurrency.lockutils [None req-e1e58e8d-1061-4cd6-a8d9-b3625e5d16ec 45cd11974e6648e1872fb5ebf9dee0b1 5906f64d8ee84f068ff9caa68ae3652b - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.449 182627 DEBUG nova.compute.manager [req-8ee77a14-a54d-4cf1-80cf-8f2c02751d0e req-2cf7263d-f1c9-4723-bcad-2696c27a33a9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.450 182627 DEBUG oslo_concurrency.lockutils [req-8ee77a14-a54d-4cf1-80cf-8f2c02751d0e req-2cf7263d-f1c9-4723-bcad-2696c27a33a9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.450 182627 DEBUG oslo_concurrency.lockutils [req-8ee77a14-a54d-4cf1-80cf-8f2c02751d0e req-2cf7263d-f1c9-4723-bcad-2696c27a33a9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.450 182627 DEBUG oslo_concurrency.lockutils [req-8ee77a14-a54d-4cf1-80cf-8f2c02751d0e req-2cf7263d-f1c9-4723-bcad-2696c27a33a9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1fc01b1b-88f4-4078-a423-704c20c2ba9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.451 182627 DEBUG nova.compute.manager [req-8ee77a14-a54d-4cf1-80cf-8f2c02751d0e req-2cf7263d-f1c9-4723-bcad-2696c27a33a9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] No waiting events found dispatching network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:18 np0005592767 nova_compute[182623]: 2026-01-22 22:42:18.451 182627 WARNING nova.compute.manager [req-8ee77a14-a54d-4cf1-80cf-8f2c02751d0e req-2cf7263d-f1c9-4723-bcad-2696c27a33a9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Received unexpected event network-vif-plugged-ffdd67b2-a5fd-4655-b692-8b8d34b65828 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:42:19 np0005592767 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 17:42:19 np0005592767 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 17:42:19 np0005592767 systemd-logind[802]: New session 65 of user nova.
Jan 22 17:42:19 np0005592767 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 17:42:19 np0005592767 systemd[1]: Starting User Manager for UID 42436...
Jan 22 17:42:19 np0005592767 systemd[231816]: Queued start job for default target Main User Target.
Jan 22 17:42:19 np0005592767 systemd[231816]: Created slice User Application Slice.
Jan 22 17:42:19 np0005592767 systemd[231816]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:42:19 np0005592767 systemd[231816]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 17:42:19 np0005592767 systemd[231816]: Reached target Paths.
Jan 22 17:42:19 np0005592767 systemd[231816]: Reached target Timers.
Jan 22 17:42:19 np0005592767 systemd[231816]: Starting D-Bus User Message Bus Socket...
Jan 22 17:42:19 np0005592767 systemd[231816]: Starting Create User's Volatile Files and Directories...
Jan 22 17:42:19 np0005592767 systemd[231816]: Listening on D-Bus User Message Bus Socket.
Jan 22 17:42:19 np0005592767 systemd[231816]: Reached target Sockets.
Jan 22 17:42:19 np0005592767 systemd[231816]: Finished Create User's Volatile Files and Directories.
Jan 22 17:42:19 np0005592767 systemd[231816]: Reached target Basic System.
Jan 22 17:42:19 np0005592767 systemd[231816]: Reached target Main User Target.
Jan 22 17:42:19 np0005592767 systemd[231816]: Startup finished in 138ms.
Jan 22 17:42:19 np0005592767 systemd[1]: Started User Manager for UID 42436.
Jan 22 17:42:19 np0005592767 systemd[1]: Started Session 65 of User nova.
Jan 22 17:42:19 np0005592767 systemd[1]: session-65.scope: Deactivated successfully.
Jan 22 17:42:19 np0005592767 podman[231830]: 2026-01-22 22:42:19.421867765 +0000 UTC m=+0.063187564 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:42:19 np0005592767 systemd-logind[802]: Session 65 logged out. Waiting for processes to exit.
Jan 22 17:42:19 np0005592767 systemd-logind[802]: Removed session 65.
Jan 22 17:42:19 np0005592767 systemd-logind[802]: New session 67 of user nova.
Jan 22 17:42:19 np0005592767 systemd[1]: Started Session 67 of User nova.
Jan 22 17:42:19 np0005592767 systemd[1]: session-67.scope: Deactivated successfully.
Jan 22 17:42:19 np0005592767 systemd-logind[802]: Session 67 logged out. Waiting for processes to exit.
Jan 22 17:42:19 np0005592767 systemd-logind[802]: Removed session 67.
Jan 22 17:42:21 np0005592767 nova_compute[182623]: 2026-01-22 22:42:21.002 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.400 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121727.398992, 5f785f08-848b-4f0c-8abd-1c873b56739b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.400 182627 INFO nova.compute.manager [-] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.450 182627 DEBUG nova.compute.manager [None req-45d58b49-30ca-4577-bfe2-df108979d4ab - - - - - -] [instance: 5f785f08-848b-4f0c-8abd-1c873b56739b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.993 182627 DEBUG nova.compute.manager [req-df559ffd-ffe6-4b92-8476-a834942730d5 req-c9eb31db-e08e-4672-9759-2ec95ffd18d4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-unplugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.993 182627 DEBUG oslo_concurrency.lockutils [req-df559ffd-ffe6-4b92-8476-a834942730d5 req-c9eb31db-e08e-4672-9759-2ec95ffd18d4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.993 182627 DEBUG oslo_concurrency.lockutils [req-df559ffd-ffe6-4b92-8476-a834942730d5 req-c9eb31db-e08e-4672-9759-2ec95ffd18d4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.994 182627 DEBUG oslo_concurrency.lockutils [req-df559ffd-ffe6-4b92-8476-a834942730d5 req-c9eb31db-e08e-4672-9759-2ec95ffd18d4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.994 182627 DEBUG nova.compute.manager [req-df559ffd-ffe6-4b92-8476-a834942730d5 req-c9eb31db-e08e-4672-9759-2ec95ffd18d4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] No waiting events found dispatching network-vif-unplugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:22 np0005592767 nova_compute[182623]: 2026-01-22 22:42:22.994 182627 WARNING nova.compute.manager [req-df559ffd-ffe6-4b92-8476-a834942730d5 req-c9eb31db-e08e-4672-9759-2ec95ffd18d4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received unexpected event network-vif-unplugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 22 17:42:23 np0005592767 systemd-logind[802]: New session 68 of user nova.
Jan 22 17:42:23 np0005592767 systemd[1]: Started Session 68 of User nova.
Jan 22 17:42:23 np0005592767 systemd[1]: session-68.scope: Deactivated successfully.
Jan 22 17:42:23 np0005592767 systemd-logind[802]: Session 68 logged out. Waiting for processes to exit.
Jan 22 17:42:23 np0005592767 systemd-logind[802]: Removed session 68.
Jan 22 17:42:23 np0005592767 systemd-logind[802]: New session 69 of user nova.
Jan 22 17:42:23 np0005592767 systemd[1]: Started Session 69 of User nova.
Jan 22 17:42:23 np0005592767 systemd[1]: session-69.scope: Deactivated successfully.
Jan 22 17:42:23 np0005592767 systemd-logind[802]: Session 69 logged out. Waiting for processes to exit.
Jan 22 17:42:23 np0005592767 systemd-logind[802]: Removed session 69.
Jan 22 17:42:24 np0005592767 systemd-logind[802]: New session 70 of user nova.
Jan 22 17:42:24 np0005592767 systemd[1]: Started Session 70 of User nova.
Jan 22 17:42:24 np0005592767 systemd[1]: session-70.scope: Deactivated successfully.
Jan 22 17:42:24 np0005592767 systemd-logind[802]: Session 70 logged out. Waiting for processes to exit.
Jan 22 17:42:24 np0005592767 systemd-logind[802]: Removed session 70.
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.715 182627 INFO nova.network.neutron [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating port a04aec10-3dd3-4092-868b-9f7dcd6b9f57 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.926 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.927 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:24 np0005592767 nova_compute[182623]: 2026-01-22 22:42:24.955 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:25 np0005592767 nova_compute[182623]: 2026-01-22 22:42:25.095 182627 DEBUG nova.compute.manager [req-80aaa92f-216e-4301-b900-fc9b3734f037 req-111399a2-ff16-4435-b8f4-168efb9eee79 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:25 np0005592767 nova_compute[182623]: 2026-01-22 22:42:25.095 182627 DEBUG oslo_concurrency.lockutils [req-80aaa92f-216e-4301-b900-fc9b3734f037 req-111399a2-ff16-4435-b8f4-168efb9eee79 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:25 np0005592767 nova_compute[182623]: 2026-01-22 22:42:25.095 182627 DEBUG oslo_concurrency.lockutils [req-80aaa92f-216e-4301-b900-fc9b3734f037 req-111399a2-ff16-4435-b8f4-168efb9eee79 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:25 np0005592767 nova_compute[182623]: 2026-01-22 22:42:25.096 182627 DEBUG oslo_concurrency.lockutils [req-80aaa92f-216e-4301-b900-fc9b3734f037 req-111399a2-ff16-4435-b8f4-168efb9eee79 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:25 np0005592767 nova_compute[182623]: 2026-01-22 22:42:25.096 182627 DEBUG nova.compute.manager [req-80aaa92f-216e-4301-b900-fc9b3734f037 req-111399a2-ff16-4435-b8f4-168efb9eee79 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] No waiting events found dispatching network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:25 np0005592767 nova_compute[182623]: 2026-01-22 22:42:25.096 182627 WARNING nova.compute.manager [req-80aaa92f-216e-4301-b900-fc9b3734f037 req-111399a2-ff16-4435-b8f4-168efb9eee79 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received unexpected event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.262 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.565 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating instance_info_cache with network_info: [{"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.590 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.591 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.591 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.592 182627 DEBUG nova.network.neutron [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.594 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.915 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:26 np0005592767 nova_compute[182623]: 2026-01-22 22:42:26.916 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.042 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.114 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.116 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5617MB free_disk=73.0929183959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.116 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.117 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.166 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Applying migration context for instance fbc39b42-1887-45e6-ba92-560d868f205a as it has an incoming, in-progress migration 543c8230-9924-4708-9d79-4550f2d7f7ee. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.166 182627 INFO nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating resource usage from migration 543c8230-9924-4708-9d79-4550f2d7f7ee#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.206 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance fbc39b42-1887-45e6-ba92-560d868f205a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.207 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.207 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.268 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.284 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.320 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.320 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.636 182627 DEBUG nova.compute.manager [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-changed-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.637 182627 DEBUG nova.compute.manager [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Refreshing instance network info cache due to event network-changed-a04aec10-3dd3-4092-868b-9f7dcd6b9f57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:42:27 np0005592767 nova_compute[182623]: 2026-01-22 22:42:27.637 182627 DEBUG oslo_concurrency.lockutils [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.321 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.322 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.332 182627 DEBUG nova.network.neutron [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating instance_info_cache with network_info: [{"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.346 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.349 182627 DEBUG oslo_concurrency.lockutils [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.350 182627 DEBUG nova.network.neutron [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Refreshing network info cache for port a04aec10-3dd3-4092-868b-9f7dcd6b9f57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.470 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.471 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.471 182627 INFO nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Creating image(s)#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.472 182627 DEBUG nova.objects.instance [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.483 182627 DEBUG oslo_concurrency.processutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.546 182627 DEBUG oslo_concurrency.processutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.548 182627 DEBUG nova.virt.disk.api [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.549 182627 DEBUG oslo_concurrency.processutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.614 182627 DEBUG oslo_concurrency.processutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.615 182627 DEBUG nova.virt.disk.api [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.661 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.661 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Ensure instance console log exists: /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.662 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.662 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.662 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.665 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Start _get_guest_xml network_info=[{"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1458626390", "vif_mac": "fa:16:3e:2c:d0:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.671 182627 WARNING nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.674 182627 DEBUG nova.virt.libvirt.host [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.675 182627 DEBUG nova.virt.libvirt.host [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.678 182627 DEBUG nova.virt.libvirt.host [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.679 182627 DEBUG nova.virt.libvirt.host [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.681 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.681 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='617fb2f8-2c15-4939-a64a-90fca4acd12a',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.682 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.682 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.683 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.683 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.684 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.684 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.684 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.685 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.685 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.686 182627 DEBUG nova.virt.hardware [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.686 182627 DEBUG nova.objects.instance [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.706 182627 DEBUG oslo_concurrency.processutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.799 182627 DEBUG oslo_concurrency.processutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk.config --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.801 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.802 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.803 182627 DEBUG oslo_concurrency.lockutils [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.806 182627 DEBUG nova.virt.libvirt.vif [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1231100165',display_name='tempest-TestNetworkAdvancedServerOps-server-1231100165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1231100165',id=137,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF0Q2rEqFYnzPDWz7KNBg99UGu3qTb9IVDNiNmsD9om8SfswfoGYDys6DfRzAtSWD6o6yQdZOshqf9KTS3NVI6woZ/k6+5rtAFCDguLkEfqWMp6b8PEmaWSl9Coz34h48w==',key_name='tempest-TestNetworkAdvancedServerOps-143414390',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:41:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-74uvtc7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:42:24Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=fbc39b42-1887-45e6-ba92-560d868f205a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1458626390", "vif_mac": "fa:16:3e:2c:d0:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.806 182627 DEBUG nova.network.os_vif_util [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1458626390", "vif_mac": "fa:16:3e:2c:d0:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.808 182627 DEBUG nova.network.os_vif_util [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.812 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <uuid>fbc39b42-1887-45e6-ba92-560d868f205a</uuid>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <name>instance-00000089</name>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <memory>196608</memory>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1231100165</nova:name>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:42:28</nova:creationTime>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.micro">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:memory>192</nova:memory>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        <nova:port uuid="a04aec10-3dd3-4092-868b-9f7dcd6b9f57">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <entry name="serial">fbc39b42-1887-45e6-ba92-560d868f205a</entry>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <entry name="uuid">fbc39b42-1887-45e6-ba92-560d868f205a</entry>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/disk.config"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:2c:d0:ea"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <target dev="tapa04aec10-3d"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a/console.log" append="off"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:42:28 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:42:28 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:42:28 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:42:28 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.814 182627 DEBUG nova.virt.libvirt.vif [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1231100165',display_name='tempest-TestNetworkAdvancedServerOps-server-1231100165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1231100165',id=137,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF0Q2rEqFYnzPDWz7KNBg99UGu3qTb9IVDNiNmsD9om8SfswfoGYDys6DfRzAtSWD6o6yQdZOshqf9KTS3NVI6woZ/k6+5rtAFCDguLkEfqWMp6b8PEmaWSl9Coz34h48w==',key_name='tempest-TestNetworkAdvancedServerOps-143414390',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:41:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-74uvtc7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:42:24Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=fbc39b42-1887-45e6-ba92-560d868f205a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1458626390", "vif_mac": "fa:16:3e:2c:d0:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.815 182627 DEBUG nova.network.os_vif_util [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1458626390", "vif_mac": "fa:16:3e:2c:d0:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.815 182627 DEBUG nova.network.os_vif_util [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.816 182627 DEBUG os_vif [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.816 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.817 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.818 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.821 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.821 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa04aec10-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.822 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa04aec10-3d, col_values=(('external_ids', {'iface-id': 'a04aec10-3dd3-4092-868b-9f7dcd6b9f57', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:d0:ea', 'vm-uuid': 'fbc39b42-1887-45e6-ba92-560d868f205a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.824 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.825 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 NetworkManager[54973]: <info>  [1769121748.8271] manager: (tapa04aec10-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.827 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.835 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.837 182627 INFO os_vif [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d')#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.894 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.895 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.895 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:2c:d0:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.896 182627 INFO nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Using config drive#033[00m
Jan 22 17:42:28 np0005592767 kernel: tapa04aec10-3d: entered promiscuous mode
Jan 22 17:42:28 np0005592767 NetworkManager[54973]: <info>  [1769121748.9799] manager: (tapa04aec10-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/254)
Jan 22 17:42:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:28Z|00544|binding|INFO|Claiming lport a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for this chassis.
Jan 22 17:42:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:28Z|00545|binding|INFO|a04aec10-3dd3-4092-868b-9f7dcd6b9f57: Claiming fa:16:3e:2c:d0:ea 10.100.0.10
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.979 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.985 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.992 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:28 np0005592767 nova_compute[182623]: 2026-01-22 22:42:28.995 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.0116] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.0128] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.016 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:d0:ea 10.100.0.10'], port_security=['fa:16:3e:2c:d0:ea 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fbc39b42-1887-45e6-ba92-560d868f205a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6f3f6f5b-27c5-4ce6-975d-b5decec6c64d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c41211b-2e23-4044-a644-f75b739a3312, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a04aec10-3dd3-4092-868b-9f7dcd6b9f57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.018 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a04aec10-3dd3-4092-868b-9f7dcd6b9f57 in datapath ee04bb14-22d4-414e-b757-5959fb8f8cee bound to our chassis#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.022 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee04bb14-22d4-414e-b757-5959fb8f8cee#033[00m
Jan 22 17:42:29 np0005592767 systemd-udevd[231906]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.036 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3c60e6ad-a90e-41a3-bf6f-c23d616e1ae4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.037 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee04bb14-21 in ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.037 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.038 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:29 np0005592767 systemd-machined[153912]: New machine qemu-71-instance-00000089.
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.040 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee04bb14-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.040 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[601927b6-dba2-463b-a558-3c16e716dca6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.042 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9b30ca-7aa7-416e-aecf-bc9d877248ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.057 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.0606] device (tapa04aec10-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.0619] device (tapa04aec10-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.065 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[cb97efe0-e733-4ae0-aef7-a5c078dffdf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 systemd[1]: Started Virtual Machine qemu-71-instance-00000089.
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.091 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[820c10d0-d0ae-452f-820f-0fa82a113fae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.122 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e400354a-1534-456b-9b17-96cc6dea2d83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.1419] manager: (tapee04bb14-20): new Veth device (/org/freedesktop/NetworkManager/Devices/257)
Jan 22 17:42:29 np0005592767 systemd-udevd[231909]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.141 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f69b076f-582c-4ae7-93af-8cbb354e7310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.144 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.150 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.151 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.163 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.163 182627 INFO nova.compute.claims [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:29Z|00546|binding|INFO|Setting lport a04aec10-3dd3-4092-868b-9f7dcd6b9f57 ovn-installed in OVS
Jan 22 17:42:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:29Z|00547|binding|INFO|Setting lport a04aec10-3dd3-4092-868b-9f7dcd6b9f57 up in Southbound
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.178 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.182 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8c2a86-f1b7-4fb9-ba99-ef0e8c12ce25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.185 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a5df00d4-f7f5-400a-88fa-f494ed5d2adc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.2091] device (tapee04bb14-20): carrier: link connected
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.214 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e6d9a4-19ed-40c5-afc3-d6a48a45f147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.233 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7c5b24-838f-4ba5-b375-2cc76f8b6460]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee04bb14-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:db:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534578, 'reachable_time': 31447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231938, 'error': None, 'target': 'ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.256 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f544fcff-563c-441d-a093-54f55215a7a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:db61'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534578, 'tstamp': 534578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231939, 'error': None, 'target': 'ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.275 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8960fa-c57b-4fa3-9a93-04d47bc21652]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee04bb14-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:db:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534578, 'reachable_time': 31447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231940, 'error': None, 'target': 'ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.306 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9238ecc2-56ee-447d-aa6c-f73d1eb8f785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.354 182627 DEBUG nova.compute.provider_tree [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.375 182627 DEBUG nova.scheduler.client.report [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.377 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[237e6c24-1b03-4165-aeb2-5d0bdf1c692f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.379 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee04bb14-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.379 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.379 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee04bb14-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:29 np0005592767 NetworkManager[54973]: <info>  [1769121749.3819] manager: (tapee04bb14-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.381 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 kernel: tapee04bb14-20: entered promiscuous mode
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.386 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.387 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee04bb14-20, col_values=(('external_ids', {'iface-id': '35248d07-330e-4750-b030-11a13cc79eeb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:29Z|00548|binding|INFO|Releasing lport 35248d07-330e-4750-b030-11a13cc79eeb from this chassis (sb_readonly=0)
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.390 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.390 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee04bb14-22d4-414e-b757-5959fb8f8cee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee04bb14-22d4-414e-b757-5959fb8f8cee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.391 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7223b8b7-bfda-4dfb-b9d2-ca1364e24978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.391 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-ee04bb14-22d4-414e-b757-5959fb8f8cee
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/ee04bb14-22d4-414e-b757-5959fb8f8cee.pid.haproxy
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID ee04bb14-22d4-414e-b757-5959fb8f8cee
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:42:29 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:29.392 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'env', 'PROCESS_TAG=haproxy-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee04bb14-22d4-414e-b757-5959fb8f8cee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.415 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.417 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.471 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.472 182627 DEBUG nova.network.neutron [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.494 182627 INFO nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.517 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.542 182627 DEBUG nova.compute.manager [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.543 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121749.542026, fbc39b42-1887-45e6-ba92-560d868f205a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.544 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.550 182627 INFO nova.virt.libvirt.driver [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Instance running successfully.#033[00m
Jan 22 17:42:29 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.553 182627 DEBUG nova.virt.libvirt.guest [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.554 182627 DEBUG nova.virt.libvirt.driver [None req-cb30a7cc-175d-4706-b562-e1c173dc1082 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.584 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.589 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.636 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.636 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121749.542257, fbc39b42-1887-45e6-ba92-560d868f205a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.637 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] VM Started (Lifecycle Event)#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.671 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.676 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.705 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.710 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.711 182627 INFO nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Creating image(s)#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.712 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.713 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.714 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.752 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.809 182627 DEBUG nova.policy [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.831 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.832 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.833 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.849 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:29 np0005592767 podman[231978]: 2026-01-22 22:42:29.888135462 +0000 UTC m=+0.078452955 container create 5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.920 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.921 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:29 np0005592767 systemd[1]: Started libpod-conmon-5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc.scope.
Jan 22 17:42:29 np0005592767 podman[231978]: 2026-01-22 22:42:29.844159581 +0000 UTC m=+0.034477104 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.964 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.965 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:29 np0005592767 nova_compute[182623]: 2026-01-22 22:42:29.965 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:29 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:42:29 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8abfbfb77a8629bfb3d8185a472dda67a1ca472227e6be5e9aca98d80876babc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:42:30 np0005592767 podman[231978]: 2026-01-22 22:42:30.008514949 +0000 UTC m=+0.198832452 container init 5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:42:30 np0005592767 podman[231978]: 2026-01-22 22:42:30.019117148 +0000 UTC m=+0.209434641 container start 5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.028 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.029 182627 DEBUG nova.virt.disk.api [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.030 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:30 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [NOTICE]   (232006) : New worker (232011) forked
Jan 22 17:42:30 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [NOTICE]   (232006) : Loading success.
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.064 182627 DEBUG nova.network.neutron [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updated VIF entry in instance network info cache for port a04aec10-3dd3-4092-868b-9f7dcd6b9f57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.066 182627 DEBUG nova.network.neutron [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating instance_info_cache with network_info: [{"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.085 182627 DEBUG oslo_concurrency.lockutils [req-2f4d9ffe-c868-4b2d-b179-7994b515ddda req-b8337dbc-ed2e-413b-9d8f-3c2530bfc5e6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.103 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.104 182627 DEBUG nova.virt.disk.api [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.104 182627 DEBUG nova.objects.instance [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid 1571b99b-d67e-4d09-9401-3b83292a110c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.120 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.120 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Ensure instance console log exists: /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.121 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.121 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.121 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.868 182627 DEBUG nova.network.neutron [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Successfully created port: 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.967 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121735.9664097, 1fc01b1b-88f4-4078-a423-704c20c2ba9d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.967 182627 INFO nova.compute.manager [-] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:42:30 np0005592767 nova_compute[182623]: 2026-01-22 22:42:30.988 182627 DEBUG nova.compute.manager [None req-f3379f3f-490a-4295-8b39-52bb0cd056d0 - - - - - -] [instance: 1fc01b1b-88f4-4078-a423-704c20c2ba9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:42:31 np0005592767 podman[232023]: 2026-01-22 22:42:31.183642165 +0000 UTC m=+0.093069917 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:42:31 np0005592767 nova_compute[182623]: 2026-01-22 22:42:31.592 182627 DEBUG nova.network.neutron [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Successfully updated port: 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:42:31 np0005592767 nova_compute[182623]: 2026-01-22 22:42:31.610 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:31 np0005592767 nova_compute[182623]: 2026-01-22 22:42:31.610 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:31 np0005592767 nova_compute[182623]: 2026-01-22 22:42:31.610 182627 DEBUG nova.network.neutron [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:42:31 np0005592767 nova_compute[182623]: 2026-01-22 22:42:31.809 182627 DEBUG nova.network.neutron [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.026 182627 DEBUG nova.compute.manager [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-changed-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.026 182627 DEBUG nova.compute.manager [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Refreshing instance network info cache due to event network-changed-2a8894c0-8fbc-41a6-bd16-1335c8114cb8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.027 182627 DEBUG oslo_concurrency.lockutils [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.044 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.607 182627 DEBUG nova.network.neutron [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Updating instance_info_cache with network_info: [{"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.629 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.629 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Instance network_info: |[{"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.630 182627 DEBUG oslo_concurrency.lockutils [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.630 182627 DEBUG nova.network.neutron [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Refreshing network info cache for port 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.633 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Start _get_guest_xml network_info=[{"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.638 182627 WARNING nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.643 182627 DEBUG nova.virt.libvirt.host [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.644 182627 DEBUG nova.virt.libvirt.host [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.651 182627 DEBUG nova.virt.libvirt.host [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.652 182627 DEBUG nova.virt.libvirt.host [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.654 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.654 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.654 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.655 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.655 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.656 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.656 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.656 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.657 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.657 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.658 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.658 182627 DEBUG nova.virt.hardware [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.662 182627 DEBUG nova.virt.libvirt.vif [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:42:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-317762552',display_name='tempest-TestNetworkBasicOps-server-317762552',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-317762552',id=140,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEq0UFb1orBcIILI2RTgUaN6tHZsu/JAgV+A/ndXGjKUeGPqT8OGj0qdVoPDZlX91mdSGC+4NhILFKOEo8bHX9xg375w5QbKnFcvt4hEvi+E448mz/RNO5I0MDQsZOWZYQ==',key_name='tempest-TestNetworkBasicOps-899242006',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-lv00vdoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:42:29Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1571b99b-d67e-4d09-9401-3b83292a110c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.663 182627 DEBUG nova.network.os_vif_util [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.663 182627 DEBUG nova.network.os_vif_util [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.665 182627 DEBUG nova.objects.instance [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1571b99b-d67e-4d09-9401-3b83292a110c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.682 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <uuid>1571b99b-d67e-4d09-9401-3b83292a110c</uuid>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <name>instance-0000008c</name>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkBasicOps-server-317762552</nova:name>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:42:32</nova:creationTime>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        <nova:port uuid="2a8894c0-8fbc-41a6-bd16-1335c8114cb8">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <entry name="serial">1571b99b-d67e-4d09-9401-3b83292a110c</entry>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <entry name="uuid">1571b99b-d67e-4d09-9401-3b83292a110c</entry>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.config"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:1e:a5:56"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <target dev="tap2a8894c0-8f"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/console.log" append="off"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:42:32 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:42:32 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:42:32 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:42:32 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.688 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Preparing to wait for external event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.688 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.688 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.689 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.689 182627 DEBUG nova.virt.libvirt.vif [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:42:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-317762552',display_name='tempest-TestNetworkBasicOps-server-317762552',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-317762552',id=140,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEq0UFb1orBcIILI2RTgUaN6tHZsu/JAgV+A/ndXGjKUeGPqT8OGj0qdVoPDZlX91mdSGC+4NhILFKOEo8bHX9xg375w5QbKnFcvt4hEvi+E448mz/RNO5I0MDQsZOWZYQ==',key_name='tempest-TestNetworkBasicOps-899242006',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-lv00vdoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:42:29Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1571b99b-d67e-4d09-9401-3b83292a110c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.690 182627 DEBUG nova.network.os_vif_util [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.690 182627 DEBUG nova.network.os_vif_util [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.691 182627 DEBUG os_vif [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.691 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.692 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.692 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.696 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a8894c0-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.697 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a8894c0-8f, col_values=(('external_ids', {'iface-id': '2a8894c0-8fbc-41a6-bd16-1335c8114cb8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a5:56', 'vm-uuid': '1571b99b-d67e-4d09-9401-3b83292a110c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:32 np0005592767 NetworkManager[54973]: <info>  [1769121752.6989] manager: (tap2a8894c0-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.698 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.702 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.708 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.709 182627 INFO os_vif [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f')#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.939 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.940 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.940 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:1e:a5:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:42:32 np0005592767 nova_compute[182623]: 2026-01-22 22:42:32.941 182627 INFO nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Using config drive#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.641 182627 INFO nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Creating config drive at /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.config#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.651 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_mlu2tei execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.798 182627 DEBUG oslo_concurrency.processutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_mlu2tei" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:42:33 np0005592767 kernel: tap2a8894c0-8f: entered promiscuous mode
Jan 22 17:42:33 np0005592767 NetworkManager[54973]: <info>  [1769121753.8684] manager: (tap2a8894c0-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Jan 22 17:42:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:33Z|00549|binding|INFO|Claiming lport 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 for this chassis.
Jan 22 17:42:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:33Z|00550|binding|INFO|2a8894c0-8fbc-41a6-bd16-1335c8114cb8: Claiming fa:16:3e:1e:a5:56 10.100.0.11
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.883 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a5:56 10.100.0.11'], port_security=['fa:16:3e:1e:a5:56 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1571b99b-d67e-4d09-9401-3b83292a110c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fd00beed-fd7d-454c-ae3a-3eec3e530c2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ebf5952-91d3-4d6e-a145-1401e7d14d3f, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2a8894c0-8fbc-41a6-bd16-1335c8114cb8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.886 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 in datapath fd739554-520e-4e70-9045-bd1e5e1f0fe0 bound to our chassis#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.890 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd739554-520e-4e70-9045-bd1e5e1f0fe0#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.888 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:33Z|00551|binding|INFO|Setting lport 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 ovn-installed in OVS
Jan 22 17:42:33 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:33Z|00552|binding|INFO|Setting lport 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 up in Southbound
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.904 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.906 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.907 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f332a83c-312a-46b5-96b6-f4aee2826bee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.908 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd739554-51 in ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.911 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd739554-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.912 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f272a81f-cc6e-4de1-affb-4da1866d37fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.912 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d56c31d9-8595-4145-bd7a-86606d7fac37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:33 np0005592767 systemd-machined[153912]: New machine qemu-72-instance-0000008c.
Jan 22 17:42:33 np0005592767 systemd[1]: Started Virtual Machine qemu-72-instance-0000008c.
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.930 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7c4af2-b172-4ecf-8aac-ebf9c8c058cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:33 np0005592767 systemd-udevd[232067]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.948 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7370d51f-27d8-44c4-9ba1-6ef4f87dfa5a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.950 182627 DEBUG nova.compute.manager [req-d6cd98ef-e0f7-4d7c-8063-beeea392a512 req-1ffbacc7-af3a-43da-b955-1cfed74aed24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.950 182627 DEBUG oslo_concurrency.lockutils [req-d6cd98ef-e0f7-4d7c-8063-beeea392a512 req-1ffbacc7-af3a-43da-b955-1cfed74aed24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.950 182627 DEBUG oslo_concurrency.lockutils [req-d6cd98ef-e0f7-4d7c-8063-beeea392a512 req-1ffbacc7-af3a-43da-b955-1cfed74aed24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.950 182627 DEBUG oslo_concurrency.lockutils [req-d6cd98ef-e0f7-4d7c-8063-beeea392a512 req-1ffbacc7-af3a-43da-b955-1cfed74aed24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.951 182627 DEBUG nova.compute.manager [req-d6cd98ef-e0f7-4d7c-8063-beeea392a512 req-1ffbacc7-af3a-43da-b955-1cfed74aed24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] No waiting events found dispatching network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:33 np0005592767 nova_compute[182623]: 2026-01-22 22:42:33.951 182627 WARNING nova.compute.manager [req-d6cd98ef-e0f7-4d7c-8063-beeea392a512 req-1ffbacc7-af3a-43da-b955-1cfed74aed24 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received unexpected event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for instance with vm_state resized and task_state None.#033[00m
Jan 22 17:42:33 np0005592767 NetworkManager[54973]: <info>  [1769121753.9589] device (tap2a8894c0-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:42:33 np0005592767 NetworkManager[54973]: <info>  [1769121753.9597] device (tap2a8894c0-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.973 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[869ce5c6-8685-433e-9f83-7d4b61f7720d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:33 np0005592767 NetworkManager[54973]: <info>  [1769121753.9797] manager: (tapfd739554-50): new Veth device (/org/freedesktop/NetworkManager/Devices/261)
Jan 22 17:42:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:33.978 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddc2932-62cb-4b5d-85e3-418459993860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.008 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[82c2bb4f-29d6-4c87-9584-5270df0a97a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.011 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[782cf1ce-64d1-431d-b5f2-d4a954f555b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 NetworkManager[54973]: <info>  [1769121754.0290] device (tapfd739554-50): carrier: link connected
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.033 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2ed08d-001a-41e9-a07e-e4084c59db00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.047 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7d2858-edc2-4283-a2a7-f15883951e29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd739554-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535060, 'reachable_time': 30568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232097, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.065 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d835c6a2-68e9-4224-aac9-df3705d80fed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:e3f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535060, 'tstamp': 535060}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232098, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.077 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c08d20f2-51e5-49bf-bf04-230ffdd3de3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd739554-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:e3:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535060, 'reachable_time': 30568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232099, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.106 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[84270570-e879-4f0d-afa4-d6526ec44be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.158 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b4905b8f-6253-4a9f-ba81-e43e33231f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.160 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd739554-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.160 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.160 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd739554-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:34 np0005592767 NetworkManager[54973]: <info>  [1769121754.1630] manager: (tapfd739554-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 22 17:42:34 np0005592767 kernel: tapfd739554-50: entered promiscuous mode
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.167 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd739554-50, col_values=(('external_ids', {'iface-id': '545ef9e3-da19-466a-bbee-43bcf179d362'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.169 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:34Z|00553|binding|INFO|Releasing lport 545ef9e3-da19-466a-bbee-43bcf179d362 from this chassis (sb_readonly=0)
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.173 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd739554-520e-4e70-9045-bd1e5e1f0fe0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd739554-520e-4e70-9045-bd1e5e1f0fe0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.175 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c036c933-08c8-42b6-aead-63d80ae64588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.177 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-fd739554-520e-4e70-9045-bd1e5e1f0fe0
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/fd739554-520e-4e70-9045-bd1e5e1f0fe0.pid.haproxy
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID fd739554-520e-4e70-9045-bd1e5e1f0fe0
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 17:42:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:34.177 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'env', 'PROCESS_TAG=haproxy-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd739554-520e-4e70-9045-bd1e5e1f0fe0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.182 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.228 182627 DEBUG nova.compute.manager [req-e0bb0806-a16a-4df5-bb37-fe22c685d5b2 req-752f4988-e880-4eff-80c0-7497cc5c28a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.228 182627 DEBUG oslo_concurrency.lockutils [req-e0bb0806-a16a-4df5-bb37-fe22c685d5b2 req-752f4988-e880-4eff-80c0-7497cc5c28a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.229 182627 DEBUG oslo_concurrency.lockutils [req-e0bb0806-a16a-4df5-bb37-fe22c685d5b2 req-752f4988-e880-4eff-80c0-7497cc5c28a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.229 182627 DEBUG oslo_concurrency.lockutils [req-e0bb0806-a16a-4df5-bb37-fe22c685d5b2 req-752f4988-e880-4eff-80c0-7497cc5c28a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.229 182627 DEBUG nova.compute.manager [req-e0bb0806-a16a-4df5-bb37-fe22c685d5b2 req-752f4988-e880-4eff-80c0-7497cc5c28a3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Processing event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.297 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121754.297065, 1571b99b-d67e-4d09-9401-3b83292a110c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.298 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] VM Started (Lifecycle Event)
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.299 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:42:34 np0005592767 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 17:42:34 np0005592767 systemd[231816]: Activating special unit Exit the Session...
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped target Main User Target.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped target Basic System.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped target Paths.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped target Sockets.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped target Timers.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 17:42:34 np0005592767 systemd[231816]: Closed D-Bus User Message Bus Socket.
Jan 22 17:42:34 np0005592767 systemd[231816]: Stopped Create User's Volatile Files and Directories.
Jan 22 17:42:34 np0005592767 systemd[231816]: Removed slice User Application Slice.
Jan 22 17:42:34 np0005592767 systemd[231816]: Reached target Shutdown.
Jan 22 17:42:34 np0005592767 systemd[231816]: Finished Exit the Session.
Jan 22 17:42:34 np0005592767 systemd[231816]: Reached target Exit the Session.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.317 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:42:34 np0005592767 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.321 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:42:34 np0005592767 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.325 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.327 182627 INFO nova.virt.libvirt.driver [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Instance spawned successfully.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.328 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:42:34 np0005592767 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.342 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.343 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121754.2975538, 1571b99b-d67e-4d09-9401-3b83292a110c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.343 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] VM Paused (Lifecycle Event)
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.350 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.351 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.351 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.352 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.352 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.353 182627 DEBUG nova.virt.libvirt.driver [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:42:34 np0005592767 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 17:42:34 np0005592767 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 17:42:34 np0005592767 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.363 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:42:34 np0005592767 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.367 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121754.3271646, 1571b99b-d67e-4d09-9401-3b83292a110c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.368 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] VM Resumed (Lifecycle Event)
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.392 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.395 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.416 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.440 182627 INFO nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Took 4.73 seconds to spawn the instance on the hypervisor.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.440 182627 DEBUG nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.524 182627 INFO nova.compute.manager [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Took 5.40 seconds to build instance.
Jan 22 17:42:34 np0005592767 podman[232139]: 2026-01-22 22:42:34.622040313 +0000 UTC m=+0.085369130 container create 886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:42:34 np0005592767 podman[232139]: 2026-01-22 22:42:34.575764488 +0000 UTC m=+0.039093305 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:42:34 np0005592767 systemd[1]: Started libpod-conmon-886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243.scope.
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.693 182627 DEBUG oslo_concurrency.lockutils [None req-07f921f2-6ba1-4177-804b-8e55f737cb7e b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.695 182627 DEBUG nova.network.neutron [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Updated VIF entry in instance network info cache for port 2a8894c0-8fbc-41a6-bd16-1335c8114cb8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.696 182627 DEBUG nova.network.neutron [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Updating instance_info_cache with network_info: [{"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:42:34 np0005592767 nova_compute[182623]: 2026-01-22 22:42:34.714 182627 DEBUG oslo_concurrency.lockutils [req-5db87880-ce20-4644-bafc-3303a13f6e90 req-f06d3b9f-0452-438e-9e72-f05a2f34203c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:42:34 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:42:34 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f04465e339dfa6d4c568e646c6271ce0cb902da1f7342d559d61251ade698b64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:42:34 np0005592767 podman[232152]: 2026-01-22 22:42:34.741052451 +0000 UTC m=+0.079145914 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:42:34 np0005592767 podman[232139]: 2026-01-22 22:42:34.741685589 +0000 UTC m=+0.205014466 container init 886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:42:34 np0005592767 podman[232139]: 2026-01-22 22:42:34.751530607 +0000 UTC m=+0.214859434 container start 886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:42:34 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [NOTICE]   (232194) : New worker (232199) forked
Jan 22 17:42:34 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [NOTICE]   (232194) : Loading success.
Jan 22 17:42:34 np0005592767 podman[232164]: 2026-01-22 22:42:34.842329409 +0000 UTC m=+0.132393997 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.062 182627 DEBUG nova.compute.manager [req-b6dd8afe-465a-4839-a4d6-b10252352cb8 req-b50043f0-b907-42f1-bfc3-6b910502511e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.063 182627 DEBUG oslo_concurrency.lockutils [req-b6dd8afe-465a-4839-a4d6-b10252352cb8 req-b50043f0-b907-42f1-bfc3-6b910502511e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.063 182627 DEBUG oslo_concurrency.lockutils [req-b6dd8afe-465a-4839-a4d6-b10252352cb8 req-b50043f0-b907-42f1-bfc3-6b910502511e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.063 182627 DEBUG oslo_concurrency.lockutils [req-b6dd8afe-465a-4839-a4d6-b10252352cb8 req-b50043f0-b907-42f1-bfc3-6b910502511e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.063 182627 DEBUG nova.compute.manager [req-b6dd8afe-465a-4839-a4d6-b10252352cb8 req-b50043f0-b907-42f1-bfc3-6b910502511e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] No waiting events found dispatching network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.064 182627 WARNING nova.compute.manager [req-b6dd8afe-465a-4839-a4d6-b10252352cb8 req-b50043f0-b907-42f1-bfc3-6b910502511e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received unexpected event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for instance with vm_state active and task_state None.
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.344 182627 DEBUG nova.compute.manager [req-71e8c0d4-e5fc-4352-ae95-0681ef581e1c req-667f77de-9342-4f25-9175-3d9ba7925efd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.344 182627 DEBUG oslo_concurrency.lockutils [req-71e8c0d4-e5fc-4352-ae95-0681ef581e1c req-667f77de-9342-4f25-9175-3d9ba7925efd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.345 182627 DEBUG oslo_concurrency.lockutils [req-71e8c0d4-e5fc-4352-ae95-0681ef581e1c req-667f77de-9342-4f25-9175-3d9ba7925efd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.345 182627 DEBUG oslo_concurrency.lockutils [req-71e8c0d4-e5fc-4352-ae95-0681ef581e1c req-667f77de-9342-4f25-9175-3d9ba7925efd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.345 182627 DEBUG nova.compute.manager [req-71e8c0d4-e5fc-4352-ae95-0681ef581e1c req-667f77de-9342-4f25-9175-3d9ba7925efd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] No waiting events found dispatching network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.346 182627 WARNING nova.compute.manager [req-71e8c0d4-e5fc-4352-ae95-0681ef581e1c req-667f77de-9342-4f25-9175-3d9ba7925efd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received unexpected event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:42:36 np0005592767 nova_compute[182623]: 2026-01-22 22:42:36.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:42:37 np0005592767 nova_compute[182623]: 2026-01-22 22:42:37.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:37 np0005592767 nova_compute[182623]: 2026-01-22 22:42:37.698 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:38 np0005592767 nova_compute[182623]: 2026-01-22 22:42:38.424 182627 DEBUG nova.compute.manager [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-changed-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:38 np0005592767 nova_compute[182623]: 2026-01-22 22:42:38.424 182627 DEBUG nova.compute.manager [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Refreshing instance network info cache due to event network-changed-2a8894c0-8fbc-41a6-bd16-1335c8114cb8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:42:38 np0005592767 nova_compute[182623]: 2026-01-22 22:42:38.425 182627 DEBUG oslo_concurrency.lockutils [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:38 np0005592767 nova_compute[182623]: 2026-01-22 22:42:38.425 182627 DEBUG oslo_concurrency.lockutils [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:38 np0005592767 nova_compute[182623]: 2026-01-22 22:42:38.425 182627 DEBUG nova.network.neutron [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Refreshing network info cache for port 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:42:39 np0005592767 nova_compute[182623]: 2026-01-22 22:42:39.802 182627 DEBUG nova.network.neutron [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Updated VIF entry in instance network info cache for port 2a8894c0-8fbc-41a6-bd16-1335c8114cb8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:42:39 np0005592767 nova_compute[182623]: 2026-01-22 22:42:39.804 182627 DEBUG nova.network.neutron [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Updating instance_info_cache with network_info: [{"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:39 np0005592767 nova_compute[182623]: 2026-01-22 22:42:39.835 182627 DEBUG oslo_concurrency.lockutils [req-9f0ca253-81e3-49f9-8b4c-64b411b37fcd req-ada7bd31-30d1-499d-923d-a20503214e4d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1571b99b-d67e-4d09-9401-3b83292a110c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:39 np0005592767 nova_compute[182623]: 2026-01-22 22:42:39.905 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:40.661 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:40 np0005592767 nova_compute[182623]: 2026-01-22 22:42:40.661 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:40.664 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:42:41 np0005592767 nova_compute[182623]: 2026-01-22 22:42:41.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:42 np0005592767 nova_compute[182623]: 2026-01-22 22:42:42.050 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:42Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2c:d0:ea 10.100.0.10
Jan 22 17:42:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:42.667 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:42 np0005592767 nova_compute[182623]: 2026-01-22 22:42:42.726 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:43 np0005592767 podman[232225]: 2026-01-22 22:42:43.144121473 +0000 UTC m=+0.048159380 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:42:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:43.152 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:be:97 10.100.0.2 2001:db8::f816:3eff:fe2c:be97'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:be97/64', 'neutron:device_id': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b64fd7f9-9daa-4dd2-9dfa-7c863399e516, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4c584d5e-ac75-444a-b20c-05a59b075ca2) old=Port_Binding(mac=['fa:16:3e:2c:be:97 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:43.153 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4c584d5e-ac75-444a-b20c-05a59b075ca2 in datapath 3676296d-a568-47ea-b6cb-2ef8aff27f14 updated#033[00m
Jan 22 17:42:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:43.155 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3676296d-a568-47ea-b6cb-2ef8aff27f14, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:42:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:43.156 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[afc92d64-12a3-4b18-9eb2-fd8266774ac4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:43 np0005592767 podman[232226]: 2026-01-22 22:42:43.170740784 +0000 UTC m=+0.086226794 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:42:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:46Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:a5:56 10.100.0.11
Jan 22 17:42:46 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:46Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:a5:56 10.100.0.11
Jan 22 17:42:47 np0005592767 nova_compute[182623]: 2026-01-22 22:42:47.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:47 np0005592767 nova_compute[182623]: 2026-01-22 22:42:47.728 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:48 np0005592767 nova_compute[182623]: 2026-01-22 22:42:48.404 182627 INFO nova.compute.manager [None req-e9ad6138-6024-4f41-839f-beca20fa4c91 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Get console output#033[00m
Jan 22 17:42:48 np0005592767 nova_compute[182623]: 2026-01-22 22:42:48.412 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.814 182627 DEBUG nova.compute.manager [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-changed-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.814 182627 DEBUG nova.compute.manager [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Refreshing instance network info cache due to event network-changed-a04aec10-3dd3-4092-868b-9f7dcd6b9f57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.814 182627 DEBUG oslo_concurrency.lockutils [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.815 182627 DEBUG oslo_concurrency.lockutils [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.815 182627 DEBUG nova.network.neutron [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Refreshing network info cache for port a04aec10-3dd3-4092-868b-9f7dcd6b9f57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.994 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.995 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.995 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.995 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:49 np0005592767 nova_compute[182623]: 2026-01-22 22:42:49.996 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.007 182627 INFO nova.compute.manager [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Terminating instance#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.017 182627 DEBUG nova.compute.manager [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:42:50 np0005592767 kernel: tapa04aec10-3d (unregistering): left promiscuous mode
Jan 22 17:42:50 np0005592767 NetworkManager[54973]: <info>  [1769121770.0383] device (tapa04aec10-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:50Z|00554|binding|INFO|Releasing lport a04aec10-3dd3-4092-868b-9f7dcd6b9f57 from this chassis (sb_readonly=0)
Jan 22 17:42:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:50Z|00555|binding|INFO|Setting lport a04aec10-3dd3-4092-868b-9f7dcd6b9f57 down in Southbound
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:50Z|00556|binding|INFO|Removing iface tapa04aec10-3d ovn-installed in OVS
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.048 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:d0:ea 10.100.0.10'], port_security=['fa:16:3e:2c:d0:ea 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fbc39b42-1887-45e6-ba92-560d868f205a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6f3f6f5b-27c5-4ce6-975d-b5decec6c64d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c41211b-2e23-4044-a644-f75b739a3312, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a04aec10-3dd3-4092-868b-9f7dcd6b9f57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.050 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a04aec10-3dd3-4092-868b-9f7dcd6b9f57 in datapath ee04bb14-22d4-414e-b757-5959fb8f8cee unbound from our chassis#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.052 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee04bb14-22d4-414e-b757-5959fb8f8cee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.053 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ae87cf-845e-4fc1-9cb0-3993db205eb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.054 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee namespace which is not needed anymore#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.077 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000089.scope: Deactivated successfully.
Jan 22 17:42:50 np0005592767 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000089.scope: Consumed 12.483s CPU time.
Jan 22 17:42:50 np0005592767 systemd-machined[153912]: Machine qemu-71-instance-00000089 terminated.
Jan 22 17:42:50 np0005592767 podman[232282]: 2026-01-22 22:42:50.160272761 +0000 UTC m=+0.070131159 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:42:50 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [NOTICE]   (232006) : haproxy version is 2.8.14-c23fe91
Jan 22 17:42:50 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [NOTICE]   (232006) : path to executable is /usr/sbin/haproxy
Jan 22 17:42:50 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [WARNING]  (232006) : Exiting Master process...
Jan 22 17:42:50 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [WARNING]  (232006) : Exiting Master process...
Jan 22 17:42:50 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [ALERT]    (232006) : Current worker (232011) exited with code 143 (Terminated)
Jan 22 17:42:50 np0005592767 neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee[231999]: [WARNING]  (232006) : All workers exited. Exiting... (0)
Jan 22 17:42:50 np0005592767 systemd[1]: libpod-5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc.scope: Deactivated successfully.
Jan 22 17:42:50 np0005592767 podman[232320]: 2026-01-22 22:42:50.21159241 +0000 UTC m=+0.053114700 container died 5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:42:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc-userdata-shm.mount: Deactivated successfully.
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.241 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay-8abfbfb77a8629bfb3d8185a472dda67a1ca472227e6be5e9aca98d80876babc-merged.mount: Deactivated successfully.
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.247 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 podman[232320]: 2026-01-22 22:42:50.256816566 +0000 UTC m=+0.098338856 container cleanup 5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:42:50 np0005592767 systemd[1]: libpod-conmon-5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc.scope: Deactivated successfully.
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.287 182627 INFO nova.virt.libvirt.driver [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Instance destroyed successfully.#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.287 182627 DEBUG nova.objects.instance [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid fbc39b42-1887-45e6-ba92-560d868f205a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.301 182627 DEBUG nova.virt.libvirt.vif [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:41:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1231100165',display_name='tempest-TestNetworkAdvancedServerOps-server-1231100165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1231100165',id=137,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF0Q2rEqFYnzPDWz7KNBg99UGu3qTb9IVDNiNmsD9om8SfswfoGYDys6DfRzAtSWD6o6yQdZOshqf9KTS3NVI6woZ/k6+5rtAFCDguLkEfqWMp6b8PEmaWSl9Coz34h48w==',key_name='tempest-TestNetworkAdvancedServerOps-143414390',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:42:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-74uvtc7y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:42:34Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=fbc39b42-1887-45e6-ba92-560d868f205a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.302 182627 DEBUG nova.network.os_vif_util [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.302 182627 DEBUG nova.network.os_vif_util [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.303 182627 DEBUG os_vif [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.305 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.305 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa04aec10-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.307 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.308 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.311 182627 INFO os_vif [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d0:ea,bridge_name='br-int',has_traffic_filtering=True,id=a04aec10-3dd3-4092-868b-9f7dcd6b9f57,network=Network(ee04bb14-22d4-414e-b757-5959fb8f8cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa04aec10-3d')#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.312 182627 INFO nova.virt.libvirt.driver [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Deleting instance files /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a_del#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.320 182627 INFO nova.virt.libvirt.driver [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Deletion of /var/lib/nova/instances/fbc39b42-1887-45e6-ba92-560d868f205a_del complete#033[00m
Jan 22 17:42:50 np0005592767 podman[232369]: 2026-01-22 22:42:50.329916898 +0000 UTC m=+0.049235820 container remove 5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.335 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e027f56c-8785-4309-9241-14e9b2e3e7d7]: (4, ('Thu Jan 22 10:42:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee (5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc)\n5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc\nThu Jan 22 10:42:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee (5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc)\n5d1e0950d0fb9ce94c5af8783ae1f00702b8cf94e5a893dd0a38b31dae18befc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.337 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf71b55-5bee-4cf3-bf9d-878d411bd905]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.338 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee04bb14-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.340 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 kernel: tapee04bb14-20: left promiscuous mode
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.342 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.345 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[53a2af85-5cf1-47cd-9ddf-73bc01362088]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.355 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.368 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[05a2f9d1-7a02-4bdf-9083-2d62829be5be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.370 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[50986c5d-cbab-4f60-af7b-4c165a2e2488]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.386 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6b9575-3d0b-4176-ae6a-cca242fb8896]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534569, 'reachable_time': 34052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232385, 'error': None, 'target': 'ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.389 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee04bb14-22d4-414e-b757-5959fb8f8cee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:42:50 np0005592767 systemd[1]: run-netns-ovnmeta\x2dee04bb14\x2d22d4\x2d414e\x2db757\x2d5959fb8f8cee.mount: Deactivated successfully.
Jan 22 17:42:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:50.389 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ee1a73-5d66-4fdf-917a-080b347a468b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.413 182627 INFO nova.compute.manager [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.413 182627 DEBUG oslo.service.loopingcall [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.414 182627 DEBUG nova.compute.manager [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:42:50 np0005592767 nova_compute[182623]: 2026-01-22 22:42:50.414 182627 DEBUG nova.network.neutron [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.311 182627 DEBUG nova.network.neutron [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updated VIF entry in instance network info cache for port a04aec10-3dd3-4092-868b-9f7dcd6b9f57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.312 182627 DEBUG nova.network.neutron [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating instance_info_cache with network_info: [{"id": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "address": "fa:16:3e:2c:d0:ea", "network": {"id": "ee04bb14-22d4-414e-b757-5959fb8f8cee", "bridge": "br-int", "label": "tempest-network-smoke--1458626390", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa04aec10-3d", "ovs_interfaceid": "a04aec10-3dd3-4092-868b-9f7dcd6b9f57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.346 182627 DEBUG oslo_concurrency.lockutils [req-c55424ed-c82a-4348-9597-c8afd64696f8 req-30e8c6c2-c2e3-45ae-a9ea-0b26b0bf9a6b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-fbc39b42-1887-45e6-ba92-560d868f205a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.588 182627 DEBUG nova.network.neutron [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.635 182627 INFO nova.compute.manager [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Took 1.22 seconds to deallocate network for instance.#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.669 182627 DEBUG nova.compute.manager [req-eba2c3cf-40b8-48ae-8ced-3bf7031d7460 req-4a962995-3021-4243-9ee6-318ec5e64662 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-deleted-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.763 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.764 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.856 182627 DEBUG nova.compute.provider_tree [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.875 182627 DEBUG nova.scheduler.client.report [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.892 182627 DEBUG nova.compute.manager [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-unplugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.893 182627 DEBUG oslo_concurrency.lockutils [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.894 182627 DEBUG oslo_concurrency.lockutils [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.894 182627 DEBUG oslo_concurrency.lockutils [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.895 182627 DEBUG nova.compute.manager [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] No waiting events found dispatching network-vif-unplugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.895 182627 WARNING nova.compute.manager [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received unexpected event network-vif-unplugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.895 182627 DEBUG nova.compute.manager [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.896 182627 DEBUG oslo_concurrency.lockutils [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.896 182627 DEBUG oslo_concurrency.lockutils [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.897 182627 DEBUG oslo_concurrency.lockutils [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.897 182627 DEBUG nova.compute.manager [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] No waiting events found dispatching network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.898 182627 WARNING nova.compute.manager [req-16058c77-de70-4aad-9d98-24d6f9702af7 req-4d98b1b5-e622-4b2e-ae86-719c2f03107b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Received unexpected event network-vif-plugged-a04aec10-3dd3-4092-868b-9f7dcd6b9f57 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.901 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:51 np0005592767 nova_compute[182623]: 2026-01-22 22:42:51.928 182627 INFO nova.scheduler.client.report [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocations for instance fbc39b42-1887-45e6-ba92-560d868f205a#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.015 182627 DEBUG oslo_concurrency.lockutils [None req-7d1240e0-36b5-4450-bd7e-4459db5a1290 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "fbc39b42-1887-45e6-ba92-560d868f205a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.054 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.529 182627 INFO nova.compute.manager [None req-9e9824b3-9810-4946-82ef-eacf939cb291 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Get console output#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.534 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.878 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.878 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.879 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.879 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.879 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.889 182627 INFO nova.compute.manager [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Terminating instance#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.897 182627 DEBUG nova.compute.manager [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:42:52 np0005592767 kernel: tap2a8894c0-8f (unregistering): left promiscuous mode
Jan 22 17:42:52 np0005592767 NetworkManager[54973]: <info>  [1769121772.9230] device (tap2a8894c0-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:42:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:52Z|00557|binding|INFO|Releasing lport 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 from this chassis (sb_readonly=0)
Jan 22 17:42:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:52Z|00558|binding|INFO|Setting lport 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 down in Southbound
Jan 22 17:42:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:42:52Z|00559|binding|INFO|Removing iface tap2a8894c0-8f ovn-installed in OVS
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.928 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.931 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:52.936 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a5:56 10.100.0.11'], port_security=['fa:16:3e:1e:a5:56 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1571b99b-d67e-4d09-9401-3b83292a110c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fd00beed-fd7d-454c-ae3a-3eec3e530c2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ebf5952-91d3-4d6e-a145-1401e7d14d3f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2a8894c0-8fbc-41a6-bd16-1335c8114cb8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:42:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:52.938 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2a8894c0-8fbc-41a6-bd16-1335c8114cb8 in datapath fd739554-520e-4e70-9045-bd1e5e1f0fe0 unbound from our chassis#033[00m
Jan 22 17:42:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:52.939 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd739554-520e-4e70-9045-bd1e5e1f0fe0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:42:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:52.940 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3028dfea-fd4a-4170-a6b6-e43667ca5b7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:52.941 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 namespace which is not needed anymore#033[00m
Jan 22 17:42:52 np0005592767 nova_compute[182623]: 2026-01-22 22:42:52.951 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:52 np0005592767 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Jan 22 17:42:52 np0005592767 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000008c.scope: Consumed 12.714s CPU time.
Jan 22 17:42:52 np0005592767 systemd-machined[153912]: Machine qemu-72-instance-0000008c terminated.
Jan 22 17:42:53 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [NOTICE]   (232194) : haproxy version is 2.8.14-c23fe91
Jan 22 17:42:53 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [NOTICE]   (232194) : path to executable is /usr/sbin/haproxy
Jan 22 17:42:53 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [WARNING]  (232194) : Exiting Master process...
Jan 22 17:42:53 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [WARNING]  (232194) : Exiting Master process...
Jan 22 17:42:53 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [ALERT]    (232194) : Current worker (232199) exited with code 143 (Terminated)
Jan 22 17:42:53 np0005592767 neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0[232165]: [WARNING]  (232194) : All workers exited. Exiting... (0)
Jan 22 17:42:53 np0005592767 systemd[1]: libpod-886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243.scope: Deactivated successfully.
Jan 22 17:42:53 np0005592767 podman[232408]: 2026-01-22 22:42:53.083916934 +0000 UTC m=+0.051133773 container died 886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:42:53 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243-userdata-shm.mount: Deactivated successfully.
Jan 22 17:42:53 np0005592767 NetworkManager[54973]: <info>  [1769121773.1236] manager: (tap2a8894c0-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Jan 22 17:42:53 np0005592767 systemd[1]: var-lib-containers-storage-overlay-f04465e339dfa6d4c568e646c6271ce0cb902da1f7342d559d61251ade698b64-merged.mount: Deactivated successfully.
Jan 22 17:42:53 np0005592767 podman[232408]: 2026-01-22 22:42:53.13336732 +0000 UTC m=+0.100584119 container cleanup 886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:42:53 np0005592767 systemd[1]: libpod-conmon-886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243.scope: Deactivated successfully.
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.162 182627 INFO nova.virt.libvirt.driver [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Instance destroyed successfully.#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.163 182627 DEBUG nova.objects.instance [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid 1571b99b-d67e-4d09-9401-3b83292a110c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.173 182627 DEBUG nova.virt.libvirt.vif [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:42:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-317762552',display_name='tempest-TestNetworkBasicOps-server-317762552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-317762552',id=140,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEq0UFb1orBcIILI2RTgUaN6tHZsu/JAgV+A/ndXGjKUeGPqT8OGj0qdVoPDZlX91mdSGC+4NhILFKOEo8bHX9xg375w5QbKnFcvt4hEvi+E448mz/RNO5I0MDQsZOWZYQ==',key_name='tempest-TestNetworkBasicOps-899242006',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:42:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-lv00vdoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:42:34Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1571b99b-d67e-4d09-9401-3b83292a110c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.174 182627 DEBUG nova.network.os_vif_util [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "address": "fa:16:3e:1e:a5:56", "network": {"id": "fd739554-520e-4e70-9045-bd1e5e1f0fe0", "bridge": "br-int", "label": "tempest-network-smoke--1495827212", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a8894c0-8f", "ovs_interfaceid": "2a8894c0-8fbc-41a6-bd16-1335c8114cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.174 182627 DEBUG nova.network.os_vif_util [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.175 182627 DEBUG os_vif [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.176 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.176 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a8894c0-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.177 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.182 182627 INFO os_vif [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a5:56,bridge_name='br-int',has_traffic_filtering=True,id=2a8894c0-8fbc-41a6-bd16-1335c8114cb8,network=Network(fd739554-520e-4e70-9045-bd1e5e1f0fe0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a8894c0-8f')#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.182 182627 INFO nova.virt.libvirt.driver [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Deleting instance files /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c_del#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.183 182627 INFO nova.virt.libvirt.driver [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Deletion of /var/lib/nova/instances/1571b99b-d67e-4d09-9401-3b83292a110c_del complete#033[00m
Jan 22 17:42:53 np0005592767 podman[232440]: 2026-01-22 22:42:53.212460091 +0000 UTC m=+0.043926920 container remove 886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.217 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5a324697-0a36-4f71-9f64-8e8ee27d4451]: (4, ('Thu Jan 22 10:42:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 (886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243)\n886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243\nThu Jan 22 10:42:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 (886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243)\n886f96981db1d5e977a14b6404666ed883110e6ffb9fbb01596f8ba1229c1243\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.219 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[390633b5-4d6e-4e8b-97f0-35b00ac7ece8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.220 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd739554-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.222 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:53 np0005592767 kernel: tapfd739554-50: left promiscuous mode
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.234 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.238 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa5c481-dd0d-4e32-bdc0-a147d9725f94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.250 182627 INFO nova.compute.manager [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.250 182627 DEBUG oslo.service.loopingcall [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.250 182627 DEBUG nova.compute.manager [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.251 182627 DEBUG nova.network.neutron [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.250 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd9877a-90f4-438d-8605-47138fb19015]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.252 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee18916-74f9-4d85-b3af-13f0f581e57b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.266 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[306f4669-b19d-449d-b43b-798fedc6378e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535054, 'reachable_time': 36668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232468, 'error': None, 'target': 'ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 systemd[1]: run-netns-ovnmeta\x2dfd739554\x2d520e\x2d4e70\x2d9045\x2dbd1e5e1f0fe0.mount: Deactivated successfully.
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.271 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd739554-520e-4e70-9045-bd1e5e1f0fe0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:42:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:42:53.271 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[b0685c87-087f-455e-8d7b-dfdf777361ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.296 182627 DEBUG nova.compute.manager [req-76981da9-e5c6-49a1-8c47-580b58eba994 req-78832a79-f5bb-4df7-b5d8-e27f171620eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-vif-unplugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.297 182627 DEBUG oslo_concurrency.lockutils [req-76981da9-e5c6-49a1-8c47-580b58eba994 req-78832a79-f5bb-4df7-b5d8-e27f171620eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.297 182627 DEBUG oslo_concurrency.lockutils [req-76981da9-e5c6-49a1-8c47-580b58eba994 req-78832a79-f5bb-4df7-b5d8-e27f171620eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.297 182627 DEBUG oslo_concurrency.lockutils [req-76981da9-e5c6-49a1-8c47-580b58eba994 req-78832a79-f5bb-4df7-b5d8-e27f171620eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.298 182627 DEBUG nova.compute.manager [req-76981da9-e5c6-49a1-8c47-580b58eba994 req-78832a79-f5bb-4df7-b5d8-e27f171620eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] No waiting events found dispatching network-vif-unplugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:53 np0005592767 nova_compute[182623]: 2026-01-22 22:42:53.298 182627 DEBUG nova.compute.manager [req-76981da9-e5c6-49a1-8c47-580b58eba994 req-78832a79-f5bb-4df7-b5d8-e27f171620eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-vif-unplugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.311 182627 DEBUG nova.network.neutron [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.331 182627 INFO nova.compute.manager [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Took 1.08 seconds to deallocate network for instance.#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.418 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.419 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.483 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.485 182627 DEBUG nova.compute.provider_tree [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.503 182627 DEBUG nova.scheduler.client.report [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.532 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.556 182627 INFO nova.scheduler.client.report [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance 1571b99b-d67e-4d09-9401-3b83292a110c#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.664 182627 DEBUG oslo_concurrency.lockutils [None req-757ecb79-893a-493b-9dc1-3bbe8a0ec991 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:54 np0005592767 nova_compute[182623]: 2026-01-22 22:42:54.669 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.458 182627 DEBUG nova.compute.manager [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.459 182627 DEBUG oslo_concurrency.lockutils [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.459 182627 DEBUG oslo_concurrency.lockutils [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.459 182627 DEBUG oslo_concurrency.lockutils [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1571b99b-d67e-4d09-9401-3b83292a110c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.460 182627 DEBUG nova.compute.manager [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] No waiting events found dispatching network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.460 182627 WARNING nova.compute.manager [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received unexpected event network-vif-plugged-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:42:55 np0005592767 nova_compute[182623]: 2026-01-22 22:42:55.460 182627 DEBUG nova.compute.manager [req-8d35ac4d-e3f8-45a7-8339-5e079d3e5682 req-cb0774fd-a9ea-4630-865b-dfc9435821bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Received event network-vif-deleted-2a8894c0-8fbc-41a6-bd16-1335c8114cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:42:57 np0005592767 nova_compute[182623]: 2026-01-22 22:42:57.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:42:58 np0005592767 nova_compute[182623]: 2026-01-22 22:42:58.179 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:02 np0005592767 nova_compute[182623]: 2026-01-22 22:43:02.058 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:02 np0005592767 podman[232470]: 2026-01-22 22:43:02.159148683 +0000 UTC m=+0.067771003 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 17:43:03 np0005592767 nova_compute[182623]: 2026-01-22 22:43:03.182 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:05 np0005592767 podman[232491]: 2026-01-22 22:43:05.237270144 +0000 UTC m=+0.137041387 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 22 17:43:05 np0005592767 podman[232490]: 2026-01-22 22:43:05.248560353 +0000 UTC m=+0.151554167 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator 
team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:43:05 np0005592767 nova_compute[182623]: 2026-01-22 22:43:05.286 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121770.284666, fbc39b42-1887-45e6-ba92-560d868f205a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:05 np0005592767 nova_compute[182623]: 2026-01-22 22:43:05.286 182627 INFO nova.compute.manager [-] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:43:05 np0005592767 nova_compute[182623]: 2026-01-22 22:43:05.309 182627 DEBUG nova.compute.manager [None req-057f1c3f-47d9-4861-9e3c-658a5b8d7598 - - - - - -] [instance: fbc39b42-1887-45e6-ba92-560d868f205a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:07 np0005592767 nova_compute[182623]: 2026-01-22 22:43:07.061 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:08 np0005592767 nova_compute[182623]: 2026-01-22 22:43:08.161 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121773.1598237, 1571b99b-d67e-4d09-9401-3b83292a110c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:08 np0005592767 nova_compute[182623]: 2026-01-22 22:43:08.162 182627 INFO nova.compute.manager [-] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:43:08 np0005592767 nova_compute[182623]: 2026-01-22 22:43:08.187 182627 DEBUG nova.compute.manager [None req-eedae850-d862-44ed-8657-10709558e839 - - - - - -] [instance: 1571b99b-d67e-4d09-9401-3b83292a110c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:08 np0005592767 nova_compute[182623]: 2026-01-22 22:43:08.187 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.412 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.413 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.431 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.522 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.523 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.533 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.533 182627 INFO nova.compute.claims [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.653 182627 DEBUG nova.compute.provider_tree [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.666 182627 DEBUG nova.scheduler.client.report [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.686 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.686 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.764 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.765 182627 DEBUG nova.network.neutron [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.786 182627 INFO nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.809 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.953 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.956 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.957 182627 INFO nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Creating image(s)#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.958 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.959 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.960 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:11 np0005592767 nova_compute[182623]: 2026-01-22 22:43:11.986 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.025 182627 DEBUG nova.policy [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2e4a08822be74c54910dd775e0f6c988', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2cf32a9545e341c0b2eaf899f8803b45', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.087 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
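The `qemu-img info --force-share --output=json` invocations above (run under `oslo_concurrency.prlimit` to cap address space and CPU time) return a JSON document that Nova parses. A sketch of consuming that output, with illustrative values rather than this host's actual ones:

```python
import json

# Sample of the JSON shape emitted by `qemu-img info --output=json`.
# The numbers and path here are illustrative, not read from this host.
sample = """{
  "virtual-size": 117440512,
  "filename": "/var/lib/nova/instances/_base/example-base-image",
  "format": "raw",
  "actual-size": 21430272
}"""

info = json.loads(sample)
print(info["format"], info["virtual-size"])
```

Parsing JSON instead of the human-readable output avoids locale and formatting pitfalls, which is also why the command is run with `LC_ALL=C LANG=C`.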
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.088 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.089 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.112 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:12.114 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:12.115 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:12.115 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.201 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.204 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.630 182627 DEBUG nova.network.neutron [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Successfully created port: c21fb745-ced9-4786-be32-0bb9c6cb73eb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.834 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk 1073741824" returned: 0 in 0.630s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
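The `qemu-img create` above builds the instance's copy-on-write qcow2 overlay on top of the shared raw base image in `_base`. A sketch of how that argument list is assembled (the helper name is hypothetical; the paths and size are the ones from the log):

```python
# Assemble the overlay-creation command seen in the log.
# qcow2_overlay_cmd is an illustrative helper, not a Nova function.
def qcow2_overlay_cmd(base, target, size_bytes):
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        target, str(size_bytes),
    ]


cmd = qcow2_overlay_cmd(
    "/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e",
    "/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk",
    1073741824,  # 1 GiB, matching the m1.nano flavor's root_gb=1
)
print(" ".join(cmd))
```

Pinning `backing_fmt=raw` matters: without an explicit backing format, newer qemu-img versions refuse to create the overlay rather than probe the base image.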
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.835 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.835 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.888 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.890 182627 DEBUG nova.virt.disk.api [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Checking if we can resize image /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.890 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.944 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.945 182627 DEBUG nova.virt.disk.api [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Cannot resize image /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
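The "Cannot resize image ... to a smaller size" message is benign here: the overlay was created at exactly the flavor's root disk size, so there is nothing to grow. A simplified sketch of the guard in `nova.virt.disk.api.can_resize_image` (approximate logic, not the verbatim implementation):

```python
# Approximate form of Nova's resize guard: disks are only ever grown
# in place; shrinking risks data loss, so equal-or-smaller requests
# are rejected.
def can_resize_image(virtual_size, requested_size):
    return requested_size > virtual_size


# Growing is allowed; shrinking or a no-op resize is not.
assert can_resize_image(1073741824, 2147483648)
assert not can_resize_image(1073741824, 1073741824)
```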
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.945 182627 DEBUG nova.objects.instance [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lazy-loading 'migration_context' on Instance uuid 73618954-39df-4c9d-b2ed-36e51779ac81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.969 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.969 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Ensure instance console log exists: /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.970 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.970 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:12 np0005592767 nova_compute[182623]: 2026-01-22 22:43:12.970 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.190 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.655 182627 DEBUG nova.network.neutron [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Successfully updated port: c21fb745-ced9-4786-be32-0bb9c6cb73eb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.669 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "refresh_cache-73618954-39df-4c9d-b2ed-36e51779ac81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.669 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquired lock "refresh_cache-73618954-39df-4c9d-b2ed-36e51779ac81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.670 182627 DEBUG nova.network.neutron [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.781 182627 DEBUG nova.compute.manager [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-changed-c21fb745-ced9-4786-be32-0bb9c6cb73eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.782 182627 DEBUG nova.compute.manager [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Refreshing instance network info cache due to event network-changed-c21fb745-ced9-4786-be32-0bb9c6cb73eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.782 182627 DEBUG oslo_concurrency.lockutils [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-73618954-39df-4c9d-b2ed-36e51779ac81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:13 np0005592767 nova_compute[182623]: 2026-01-22 22:43:13.864 182627 DEBUG nova.network.neutron [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:43:14 np0005592767 podman[232550]: 2026-01-22 22:43:14.156998454 +0000 UTC m=+0.066792396 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)

Jan 22 17:43:14 np0005592767 podman[232551]: 2026-01-22 22:43:14.17315349 +0000 UTC m=+0.079591447 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.033 182627 DEBUG nova.network.neutron [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Updating instance_info_cache with network_info: [{"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.048 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Releasing lock "refresh_cache-73618954-39df-4c9d-b2ed-36e51779ac81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.048 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Instance network_info: |[{"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
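The `network_info` structure logged above is plain JSON, so the fields the rest of the boot needs (MAC, fixed IP, tap device name) are easy to pull out. A sketch over a trimmed copy of the entry from this log, keeping only the fields used here:

```python
import json

# Trimmed copy of the network_info port entry from the log above;
# only the fields read below are retained.
port = json.loads("""{
  "id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb",
  "address": "fa:16:3e:f4:d2:3c",
  "devname": "tapc21fb745-ce",
  "network": {"subnets": [{"ips": [{"address": "10.100.0.13"}]}]}
}""")

fixed_ip = port["network"]["subnets"][0]["ips"][0]["address"]
print(port["devname"], port["address"], fixed_ip)
```

Note the tap device name is derived from the port UUID's first characters (`tapc21fb745-ce` from port `c21fb745-...`), which is how the log lines about the port and the interface correlate.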
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.048 182627 DEBUG oslo_concurrency.lockutils [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-73618954-39df-4c9d-b2ed-36e51779ac81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.049 182627 DEBUG nova.network.neutron [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Refreshing network info cache for port c21fb745-ced9-4786-be32-0bb9c6cb73eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.052 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Start _get_guest_xml network_info=[{"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m

Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.057 182627 WARNING nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.066 182627 DEBUG nova.virt.libvirt.host [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.067 182627 DEBUG nova.virt.libvirt.host [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.071 182627 DEBUG nova.virt.libvirt.host [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.072 182627 DEBUG nova.virt.libvirt.host [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.074 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.074 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.075 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.075 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.075 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.075 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.076 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.076 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.076 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.077 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.077 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.077 182627 DEBUG nova.virt.hardware [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.082 182627 DEBUG nova.virt.libvirt.vif [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-867007571',display_name='tempest-ServerPasswordTestJSON-server-867007571',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-867007571',id=143,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2cf32a9545e341c0b2eaf899f8803b45',ramdisk_id='',reservation_id='r-54sdmof3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-2051664262',owner_user_name='tempest-ServerPasswordTestJSON-2051664262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:11Z,user_data=None,user_id='2e4a08822be74c54910dd775e0f6c988',uuid=73618954-39df-4c9d-b2ed-36e51779ac81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.082 182627 DEBUG nova.network.os_vif_util [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Converting VIF {"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.083 182627 DEBUG nova.network.os_vif_util [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.084 182627 DEBUG nova.objects.instance [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lazy-loading 'pci_devices' on Instance uuid 73618954-39df-4c9d-b2ed-36e51779ac81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.096 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <uuid>73618954-39df-4c9d-b2ed-36e51779ac81</uuid>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <name>instance-0000008f</name>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerPasswordTestJSON-server-867007571</nova:name>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:43:15</nova:creationTime>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:user uuid="2e4a08822be74c54910dd775e0f6c988">tempest-ServerPasswordTestJSON-2051664262-project-member</nova:user>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:project uuid="2cf32a9545e341c0b2eaf899f8803b45">tempest-ServerPasswordTestJSON-2051664262</nova:project>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        <nova:port uuid="c21fb745-ced9-4786-be32-0bb9c6cb73eb">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <entry name="serial">73618954-39df-4c9d-b2ed-36e51779ac81</entry>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <entry name="uuid">73618954-39df-4c9d-b2ed-36e51779ac81</entry>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.config"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:f4:d2:3c"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <target dev="tapc21fb745-ce"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/console.log" append="off"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:43:15 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:43:15 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:43:15 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:43:15 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.097 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Preparing to wait for external event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.098 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.098 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.099 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.100 182627 DEBUG nova.virt.libvirt.vif [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-867007571',display_name='tempest-ServerPasswordTestJSON-server-867007571',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-867007571',id=143,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2cf32a9545e341c0b2eaf899f8803b45',ramdisk_id='',reservation_id='r-54sdmof3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-2051664262',owner_user_name='tempest-ServerPasswordTestJSON-2051664262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:11Z,user_data=None,user_id='2e4a08822be74c54910dd775e0f6c988',uuid=73618954-39df-4c9d-b2ed-36e51779ac81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.100 182627 DEBUG nova.network.os_vif_util [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Converting VIF {"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.101 182627 DEBUG nova.network.os_vif_util [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.102 182627 DEBUG os_vif [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.103 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.104 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.107 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc21fb745-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.107 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc21fb745-ce, col_values=(('external_ids', {'iface-id': 'c21fb745-ced9-4786-be32-0bb9c6cb73eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:d2:3c', 'vm-uuid': '73618954-39df-4c9d-b2ed-36e51779ac81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 NetworkManager[54973]: <info>  [1769121795.1097] manager: (tapc21fb745-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.115 182627 INFO os_vif [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce')#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.169 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.169 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.169 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] No VIF found with MAC fa:16:3e:f4:d2:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.170 182627 INFO nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Using config drive#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.689 182627 INFO nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Creating config drive at /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.config#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.694 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7y33osto execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.833 182627 DEBUG oslo_concurrency.processutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7y33osto" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:15 np0005592767 kernel: tapc21fb745-ce: entered promiscuous mode
Jan 22 17:43:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:15Z|00560|binding|INFO|Claiming lport c21fb745-ced9-4786-be32-0bb9c6cb73eb for this chassis.
Jan 22 17:43:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:15Z|00561|binding|INFO|c21fb745-ced9-4786-be32-0bb9c6cb73eb: Claiming fa:16:3e:f4:d2:3c 10.100.0.13
Jan 22 17:43:15 np0005592767 NetworkManager[54973]: <info>  [1769121795.9050] manager: (tapc21fb745-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.904 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.911 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.922 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:d2:3c 10.100.0.13'], port_security=['fa:16:3e:f4:d2:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '73618954-39df-4c9d-b2ed-36e51779ac81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cf32a9545e341c0b2eaf899f8803b45', 'neutron:revision_number': '2', 'neutron:security_group_ids': '35bdae83-feb5-4e1c-aca1-42de2609ae93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b2c9472-340c-4b83-8804-cbd0bdffcb05, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=c21fb745-ced9-4786-be32-0bb9c6cb73eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.923 104135 INFO neutron.agent.ovn.metadata.agent [-] Port c21fb745-ced9-4786-be32-0bb9c6cb73eb in datapath 80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 bound to our chassis#033[00m
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.925 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 80af5ceb-00b1-4fd9-a134-f5a0f3dd4933#033[00m
Jan 22 17:43:15 np0005592767 systemd-udevd[232611]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:15 np0005592767 systemd-machined[153912]: New machine qemu-73-instance-0000008f.
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.937 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2724f4e4-914b-4607-9479-548c5f05602d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.938 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap80af5ceb-01 in ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:43:15 np0005592767 NetworkManager[54973]: <info>  [1769121795.9412] device (tapc21fb745-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.941 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap80af5ceb-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:43:15 np0005592767 NetworkManager[54973]: <info>  [1769121795.9419] device (tapc21fb745-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.941 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b4d9af-a3c3-4699-ac75-0b1658a54b2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.942 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4aa91f-d635-43dd-b9dc-641ba2ed01f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.953 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[059e8f18-7650-410e-b03f-c78a067d2d35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:15Z|00562|binding|INFO|Setting lport c21fb745-ced9-4786-be32-0bb9c6cb73eb ovn-installed in OVS
Jan 22 17:43:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:15Z|00563|binding|INFO|Setting lport c21fb745-ced9-4786-be32-0bb9c6cb73eb up in Southbound
Jan 22 17:43:15 np0005592767 nova_compute[182623]: 2026-01-22 22:43:15.963 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:15 np0005592767 systemd[1]: Started Virtual Machine qemu-73-instance-0000008f.
Jan 22 17:43:15 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:15.975 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[03724a3b-72eb-45a5-b7f1-e0b268c78701]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.007 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[b4140818-434a-4c63-a25e-f44ea62bc049]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.013 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c17bba65-3137-46df-80d5-6b60c4decd6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 NetworkManager[54973]: <info>  [1769121796.0140] manager: (tap80af5ceb-00): new Veth device (/org/freedesktop/NetworkManager/Devices/266)
Jan 22 17:43:16 np0005592767 systemd-udevd[232615]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.043 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ebe7d8-f8e8-443f-be25-89560fed9fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.047 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[312e590a-8731-44bd-b842-5f4f8de8f2c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 NetworkManager[54973]: <info>  [1769121796.0697] device (tap80af5ceb-00): carrier: link connected
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.075 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc8a996-53c9-4f0d-8c83-16193e28d0a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.095 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[771c64fe-af48-4f07-8099-459d1f4fdec4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80af5ceb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:64:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539264, 'reachable_time': 17435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232645, 'error': None, 'target': 'ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.110 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[96b793fe-41c9-4fe7-8707-77cb8114c7fc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe62:64d4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539264, 'tstamp': 539264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232646, 'error': None, 'target': 'ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.127 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[96b2e51a-5247-4d15-bb4f-5cf8c4c2b89f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap80af5ceb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:64:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539264, 'reachable_time': 17435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232647, 'error': None, 'target': 'ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.146 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.146 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.161 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0597dfdb-7dcd-4c87-b2a1-64c4033776c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.166 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.213 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c63216e1-8a11-46fe-aa97-e832f6297934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.214 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80af5ceb-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.215 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.215 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80af5ceb-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.216 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:16 np0005592767 NetworkManager[54973]: <info>  [1769121796.2171] manager: (tap80af5ceb-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Jan 22 17:43:16 np0005592767 kernel: tap80af5ceb-00: entered promiscuous mode
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.220 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap80af5ceb-00, col_values=(('external_ids', {'iface-id': '7f97dbcf-88cf-4744-b8de-b9a470fd6923'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:16 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:16Z|00564|binding|INFO|Releasing lport 7f97dbcf-88cf-4744-b8de-b9a470fd6923 from this chassis (sb_readonly=0)
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.221 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.223 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/80af5ceb-00b1-4fd9-a134-f5a0f3dd4933.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/80af5ceb-00b1-4fd9-a134-f5a0f3dd4933.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.224 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ddea55d-fefb-495b-8c19-696b1d2f62f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.224 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/80af5ceb-00b1-4fd9-a134-f5a0f3dd4933.pid.haproxy
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 80af5ceb-00b1-4fd9-a134-f5a0f3dd4933
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:43:16 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:16.225 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'env', 'PROCESS_TAG=haproxy-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/80af5ceb-00b1-4fd9-a134-f5a0f3dd4933.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.229 182627 DEBUG nova.compute.manager [req-afafe431-e4ab-4e93-96bc-d35c00cdbe63 req-d58a3704-6120-4dcf-b3eb-597cdde4e298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.229 182627 DEBUG oslo_concurrency.lockutils [req-afafe431-e4ab-4e93-96bc-d35c00cdbe63 req-d58a3704-6120-4dcf-b3eb-597cdde4e298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.229 182627 DEBUG oslo_concurrency.lockutils [req-afafe431-e4ab-4e93-96bc-d35c00cdbe63 req-d58a3704-6120-4dcf-b3eb-597cdde4e298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.230 182627 DEBUG oslo_concurrency.lockutils [req-afafe431-e4ab-4e93-96bc-d35c00cdbe63 req-d58a3704-6120-4dcf-b3eb-597cdde4e298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.230 182627 DEBUG nova.compute.manager [req-afafe431-e4ab-4e93-96bc-d35c00cdbe63 req-d58a3704-6120-4dcf-b3eb-597cdde4e298 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Processing event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.233 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.249 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121796.2493773, 73618954-39df-4c9d-b2ed-36e51779ac81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.250 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] VM Started (Lifecycle Event)
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.251 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.256 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.259 182627 INFO nova.virt.libvirt.driver [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Instance spawned successfully.
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.260 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.266 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.266 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.272 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.273 182627 INFO nova.compute.claims [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.277 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.284 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.288 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.288 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.289 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.289 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.290 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.290 182627 DEBUG nova.virt.libvirt.driver [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.320 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.321 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121796.250041, 73618954-39df-4c9d-b2ed-36e51779ac81 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.321 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] VM Paused (Lifecycle Event)
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.354 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.357 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121796.2540784, 73618954-39df-4c9d-b2ed-36e51779ac81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.357 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] VM Resumed (Lifecycle Event)
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.385 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.394 182627 INFO nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Took 4.44 seconds to spawn the instance on the hypervisor.
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.394 182627 DEBUG nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.399 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.443 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.480 182627 DEBUG nova.compute.provider_tree [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.492 182627 INFO nova.compute.manager [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Took 5.01 seconds to build instance.
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.505 182627 DEBUG nova.scheduler.client.report [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.509 182627 DEBUG oslo_concurrency.lockutils [None req-acd03cc8-38f0-4263-97ca-35d2c1973d23 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.522 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.523 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.575 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.575 182627 DEBUG nova.network.neutron [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.589 182627 DEBUG nova.network.neutron [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Updated VIF entry in instance network info cache for port c21fb745-ced9-4786-be32-0bb9c6cb73eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.589 182627 DEBUG nova.network.neutron [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Updating instance_info_cache with network_info: [{"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.591 182627 INFO nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:43:16 np0005592767 podman[232685]: 2026-01-22 22:43:16.594565943 +0000 UTC m=+0.053018667 container create 1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.609 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.619 182627 DEBUG oslo_concurrency.lockutils [req-473cf876-97e6-40f2-bb2f-3c42ca8eeb92 req-299c1db4-c1cc-4ae2-8a70-e84e17432f8d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-73618954-39df-4c9d-b2ed-36e51779ac81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:43:16 np0005592767 systemd[1]: Started libpod-conmon-1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8.scope.
Jan 22 17:43:16 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:43:16 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aed42e562144efe551b4d7658e579f56ddd34121191c7ce0e139d7b57f34cfa9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:43:16 np0005592767 podman[232685]: 2026-01-22 22:43:16.564511015 +0000 UTC m=+0.022963749 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:43:16 np0005592767 podman[232685]: 2026-01-22 22:43:16.672732179 +0000 UTC m=+0.131184903 container init 1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:43:16 np0005592767 podman[232685]: 2026-01-22 22:43:16.678842441 +0000 UTC m=+0.137295155 container start 1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:43:16 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [NOTICE]   (232704) : New worker (232706) forked
Jan 22 17:43:16 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [NOTICE]   (232704) : Loading success.
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.792 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.793 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.794 182627 INFO nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Creating image(s)
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.794 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.794 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.795 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.807 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.860 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.861 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.862 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.872 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.889 182627 DEBUG nova.policy [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.924 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.924 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.953 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.954 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:16 np0005592767 nova_compute[182623]: 2026-01-22 22:43:16.954 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.006 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.008 182627 DEBUG nova.virt.disk.api [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.008 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.062 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.064 182627 DEBUG nova.virt.disk.api [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.065 182627 DEBUG nova.objects.instance [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.068 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.085 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.086 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Ensure instance console log exists: /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.086 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.086 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.087 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:17 np0005592767 nova_compute[182623]: 2026-01-22 22:43:17.610 182627 DEBUG nova.network.neutron [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Successfully created port: 4a077200-6d1a-4174-ba2c-090123ed6b58 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.340 182627 DEBUG nova.compute.manager [req-7b1b7393-9387-4d42-ab8f-f02893072e12 req-a65f5a8f-35d5-448b-b20f-f85bbdb04a49 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.341 182627 DEBUG oslo_concurrency.lockutils [req-7b1b7393-9387-4d42-ab8f-f02893072e12 req-a65f5a8f-35d5-448b-b20f-f85bbdb04a49 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.341 182627 DEBUG oslo_concurrency.lockutils [req-7b1b7393-9387-4d42-ab8f-f02893072e12 req-a65f5a8f-35d5-448b-b20f-f85bbdb04a49 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.341 182627 DEBUG oslo_concurrency.lockutils [req-7b1b7393-9387-4d42-ab8f-f02893072e12 req-a65f5a8f-35d5-448b-b20f-f85bbdb04a49 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.342 182627 DEBUG nova.compute.manager [req-7b1b7393-9387-4d42-ab8f-f02893072e12 req-a65f5a8f-35d5-448b-b20f-f85bbdb04a49 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] No waiting events found dispatching network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.342 182627 WARNING nova.compute.manager [req-7b1b7393-9387-4d42-ab8f-f02893072e12 req-a65f5a8f-35d5-448b-b20f-f85bbdb04a49 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received unexpected event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb for instance with vm_state active and task_state None.#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.875 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.876 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.877 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.878 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.878 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.896 182627 INFO nova.compute.manager [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Terminating instance#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.911 182627 DEBUG nova.compute.manager [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:43:18 np0005592767 kernel: tapc21fb745-ce (unregistering): left promiscuous mode
Jan 22 17:43:18 np0005592767 NetworkManager[54973]: <info>  [1769121798.9311] device (tapc21fb745-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.943 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:18Z|00565|binding|INFO|Releasing lport c21fb745-ced9-4786-be32-0bb9c6cb73eb from this chassis (sb_readonly=0)
Jan 22 17:43:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:18Z|00566|binding|INFO|Setting lport c21fb745-ced9-4786-be32-0bb9c6cb73eb down in Southbound
Jan 22 17:43:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:18Z|00567|binding|INFO|Removing iface tapc21fb745-ce ovn-installed in OVS
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:18.970 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:d2:3c 10.100.0.13'], port_security=['fa:16:3e:f4:d2:3c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '73618954-39df-4c9d-b2ed-36e51779ac81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cf32a9545e341c0b2eaf899f8803b45', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35bdae83-feb5-4e1c-aca1-42de2609ae93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b2c9472-340c-4b83-8804-cbd0bdffcb05, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=c21fb745-ced9-4786-be32-0bb9c6cb73eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:18.973 104135 INFO neutron.agent.ovn.metadata.agent [-] Port c21fb745-ced9-4786-be32-0bb9c6cb73eb in datapath 80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 unbound from our chassis#033[00m
Jan 22 17:43:18 np0005592767 nova_compute[182623]: 2026-01-22 22:43:18.977 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:18.978 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:43:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:18.979 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[441c9a29-0f73-4d23-b52f-a39e5b4e72e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:18.980 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 namespace which is not needed anymore#033[00m
Jan 22 17:43:18 np0005592767 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Jan 22 17:43:18 np0005592767 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000008f.scope: Consumed 2.891s CPU time.
Jan 22 17:43:19 np0005592767 systemd-machined[153912]: Machine qemu-73-instance-0000008f terminated.
Jan 22 17:43:19 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [NOTICE]   (232704) : haproxy version is 2.8.14-c23fe91
Jan 22 17:43:19 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [NOTICE]   (232704) : path to executable is /usr/sbin/haproxy
Jan 22 17:43:19 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [WARNING]  (232704) : Exiting Master process...
Jan 22 17:43:19 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [ALERT]    (232704) : Current worker (232706) exited with code 143 (Terminated)
Jan 22 17:43:19 np0005592767 neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933[232700]: [WARNING]  (232704) : All workers exited. Exiting... (0)
Jan 22 17:43:19 np0005592767 systemd[1]: libpod-1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8.scope: Deactivated successfully.
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.159 182627 DEBUG nova.network.neutron [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Successfully updated port: 4a077200-6d1a-4174-ba2c-090123ed6b58 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:43:19 np0005592767 podman[232755]: 2026-01-22 22:43:19.165679459 +0000 UTC m=+0.059868020 container died 1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.175 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.176 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.176 182627 DEBUG nova.network.neutron [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.193 182627 INFO nova.virt.libvirt.driver [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Instance destroyed successfully.#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.194 182627 DEBUG nova.objects.instance [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lazy-loading 'resources' on Instance uuid 73618954-39df-4c9d-b2ed-36e51779ac81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:19 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8-userdata-shm.mount: Deactivated successfully.
Jan 22 17:43:19 np0005592767 systemd[1]: var-lib-containers-storage-overlay-aed42e562144efe551b4d7658e579f56ddd34121191c7ce0e139d7b57f34cfa9-merged.mount: Deactivated successfully.
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.208 182627 DEBUG nova.virt.libvirt.vif [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-867007571',display_name='tempest-ServerPasswordTestJSON-server-867007571',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-867007571',id=143,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2cf32a9545e341c0b2eaf899f8803b45',ramdisk_id='',reservation_id='r-54sdmof3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-2051664262',owner_user_name='tempest-ServerPasswordTestJSON-2051664262-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:18Z,user_data=None,user_id='2e4a08822be74c54910dd775e0f6c988',uuid=73618954-39df-4c9d-b2ed-36e51779ac81,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.209 182627 DEBUG nova.network.os_vif_util [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Converting VIF {"id": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "address": "fa:16:3e:f4:d2:3c", "network": {"id": "80af5ceb-00b1-4fd9-a134-f5a0f3dd4933", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1795400106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2cf32a9545e341c0b2eaf899f8803b45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc21fb745-ce", "ovs_interfaceid": "c21fb745-ced9-4786-be32-0bb9c6cb73eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.210 182627 DEBUG nova.network.os_vif_util [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.211 182627 DEBUG os_vif [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.212 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.212 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc21fb745-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.214 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.215 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.217 182627 INFO os_vif [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:d2:3c,bridge_name='br-int',has_traffic_filtering=True,id=c21fb745-ced9-4786-be32-0bb9c6cb73eb,network=Network(80af5ceb-00b1-4fd9-a134-f5a0f3dd4933),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc21fb745-ce')#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.218 182627 INFO nova.virt.libvirt.driver [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Deleting instance files /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81_del#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.218 182627 INFO nova.virt.libvirt.driver [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Deletion of /var/lib/nova/instances/73618954-39df-4c9d-b2ed-36e51779ac81_del complete#033[00m
Jan 22 17:43:19 np0005592767 podman[232755]: 2026-01-22 22:43:19.219875748 +0000 UTC m=+0.114064299 container cleanup 1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:43:19 np0005592767 systemd[1]: libpod-conmon-1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8.scope: Deactivated successfully.
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.283 182627 INFO nova.compute.manager [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.284 182627 DEBUG oslo.service.loopingcall [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.285 182627 DEBUG nova.compute.manager [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.285 182627 DEBUG nova.network.neutron [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:43:19 np0005592767 podman[232803]: 2026-01-22 22:43:19.316905856 +0000 UTC m=+0.064191492 container remove 1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.322 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[33f3fedb-423f-4ee6-a578-e019f047f762]: (4, ('Thu Jan 22 10:43:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 (1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8)\n1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8\nThu Jan 22 10:43:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 (1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8)\n1c3b37b0f2c2f569a4d554ef3636212aecf6d41948f3838571299d131a2ea9b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.324 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9faca75d-1aeb-4e10-bce3-1814e27c1cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.325 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80af5ceb-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:19 np0005592767 kernel: tap80af5ceb-00: left promiscuous mode
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.361 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.372 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.375 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ff3938-b6ad-4e54-a504-6c5bc533c32a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 nova_compute[182623]: 2026-01-22 22:43:19.384 182627 DEBUG nova.network.neutron [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.389 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cb03f461-b1d3-4cc0-89b5-535871917a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.390 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[415cadb7-fd59-45e4-a9b7-f3ae1b8688d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.410 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[27bbaa31-eb2c-4b45-8ed0-dca7650db655]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539258, 'reachable_time': 41493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232816, 'error': None, 'target': 'ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.415 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-80af5ceb-00b1-4fd9-a134-f5a0f3dd4933 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:43:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:19.415 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[8c82834f-967f-42ab-bcad-bbca0d7cba4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:19 np0005592767 systemd[1]: run-netns-ovnmeta\x2d80af5ceb\x2d00b1\x2d4fd9\x2da134\x2df5a0f3dd4933.mount: Deactivated successfully.
Jan 22 17:43:20 np0005592767 nova_compute[182623]: 2026-01-22 22:43:20.426 182627 DEBUG nova.compute.manager [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-changed-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:20 np0005592767 nova_compute[182623]: 2026-01-22 22:43:20.426 182627 DEBUG nova.compute.manager [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing instance network info cache due to event network-changed-4a077200-6d1a-4174-ba2c-090123ed6b58. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:20 np0005592767 nova_compute[182623]: 2026-01-22 22:43:20.427 182627 DEBUG oslo_concurrency.lockutils [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:21 np0005592767 podman[232817]: 2026-01-22 22:43:21.16423481 +0000 UTC m=+0.074907195 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.790 182627 DEBUG nova.compute.manager [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-vif-unplugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.791 182627 DEBUG oslo_concurrency.lockutils [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.791 182627 DEBUG oslo_concurrency.lockutils [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.792 182627 DEBUG oslo_concurrency.lockutils [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.792 182627 DEBUG nova.compute.manager [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] No waiting events found dispatching network-vif-unplugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.792 182627 DEBUG nova.compute.manager [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-vif-unplugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.793 182627 DEBUG nova.compute.manager [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.793 182627 DEBUG oslo_concurrency.lockutils [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.794 182627 DEBUG oslo_concurrency.lockutils [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.794 182627 DEBUG oslo_concurrency.lockutils [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.795 182627 DEBUG nova.compute.manager [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] No waiting events found dispatching network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.795 182627 WARNING nova.compute.manager [req-37debc71-40fe-4f1f-a49a-d598e5fc63cf req-5da9a5b7-117d-4fd0-9c43-73d75a160f50 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received unexpected event network-vif-plugged-c21fb745-ced9-4786-be32-0bb9c6cb73eb for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.955 182627 DEBUG nova.network.neutron [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:21 np0005592767 nova_compute[182623]: 2026-01-22 22:43:21.980 182627 INFO nova.compute.manager [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Took 2.69 seconds to deallocate network for instance.#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.050 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.050 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.067 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.128 182627 DEBUG nova.network.neutron [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.135 182627 DEBUG nova.compute.provider_tree [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.151 182627 DEBUG nova.scheduler.client.report [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.154 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.154 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Instance network_info: |[{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.154 182627 DEBUG oslo_concurrency.lockutils [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.155 182627 DEBUG nova.network.neutron [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing network info cache for port 4a077200-6d1a-4174-ba2c-090123ed6b58 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.157 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Start _get_guest_xml network_info=[{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.161 182627 WARNING nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.165 182627 DEBUG nova.virt.libvirt.host [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.166 182627 DEBUG nova.virt.libvirt.host [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.169 182627 DEBUG nova.virt.libvirt.host [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.170 182627 DEBUG nova.virt.libvirt.host [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.171 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.171 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.172 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.172 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.173 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.173 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.173 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.174 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.174 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.175 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.175 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.175 182627 DEBUG nova.virt.hardware [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.180 182627 DEBUG nova.virt.libvirt.vif [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:16Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.180 182627 DEBUG nova.network.os_vif_util [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.181 182627 DEBUG nova.network.os_vif_util [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.183 182627 DEBUG nova.objects.instance [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.190 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.196 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <uuid>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</uuid>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <name>instance-00000091</name>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkBasicOps-server-1043176814</nova:name>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:43:22</nova:creationTime>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        <nova:port uuid="4a077200-6d1a-4174-ba2c-090123ed6b58">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <entry name="serial">1f55de0e-e258-4f65-a0e0-f26bebf85ccb</entry>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <entry name="uuid">1f55de0e-e258-4f65-a0e0-f26bebf85ccb</entry>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.config"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:48:c1:ef"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <target dev="tap4a077200-6d"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/console.log" append="off"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:43:22 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:43:22 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:43:22 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:43:22 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.198 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Preparing to wait for external event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.198 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.198 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.199 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.200 182627 DEBUG nova.virt.libvirt.vif [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:16Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.200 182627 DEBUG nova.network.os_vif_util [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.201 182627 DEBUG nova.network.os_vif_util [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.202 182627 DEBUG os_vif [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.202 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.203 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.203 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.206 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.206 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a077200-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.207 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a077200-6d, col_values=(('external_ids', {'iface-id': '4a077200-6d1a-4174-ba2c-090123ed6b58', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:c1:ef', 'vm-uuid': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.208 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 NetworkManager[54973]: <info>  [1769121802.2094] manager: (tap4a077200-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.211 182627 INFO nova.scheduler.client.report [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Deleted allocations for instance 73618954-39df-4c9d-b2ed-36e51779ac81#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.214 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.215 182627 INFO os_vif [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d')#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.277 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.277 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.277 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:48:c1:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.279 182627 INFO nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Using config drive#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.287 182627 DEBUG oslo_concurrency.lockutils [None req-a487d017-51f9-46fd-805b-654940cd0011 2e4a08822be74c54910dd775e0f6c988 2cf32a9545e341c0b2eaf899f8803b45 - - default default] Lock "73618954-39df-4c9d-b2ed-36e51779ac81" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.565 182627 INFO nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Creating config drive at /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.config#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.570 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6xgem_4d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.695 182627 DEBUG oslo_concurrency.processutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6xgem_4d" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:22 np0005592767 NetworkManager[54973]: <info>  [1769121802.7810] manager: (tap4a077200-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Jan 22 17:43:22 np0005592767 kernel: tap4a077200-6d: entered promiscuous mode
Jan 22 17:43:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:22Z|00568|binding|INFO|Claiming lport 4a077200-6d1a-4174-ba2c-090123ed6b58 for this chassis.
Jan 22 17:43:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:22Z|00569|binding|INFO|4a077200-6d1a-4174-ba2c-090123ed6b58: Claiming fa:16:3e:48:c1:ef 10.100.0.3
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.788 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.794 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.799 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.808 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:c1:ef 10.100.0.3'], port_security=['fa:16:3e:48:c1:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b8224f0-0e08-4065-b940-1530a6a30708', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6c3e3c74-0fd6-4ae4-95ed-b97b1894cf2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c373b96b-a79e-44de-a1da-4f3934614dac, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4a077200-6d1a-4174-ba2c-090123ed6b58) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.809 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4a077200-6d1a-4174-ba2c-090123ed6b58 in datapath 9b8224f0-0e08-4065-b940-1530a6a30708 bound to our chassis#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.810 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b8224f0-0e08-4065-b940-1530a6a30708#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.824 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56ad111f-fad5-4952-8f9e-c19494f24e62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.825 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b8224f0-01 in ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.829 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b8224f0-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.829 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b35036bf-09e8-43d3-94f9-98fb222ff5b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.831 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ee90016a-abdc-412d-b3b5-c7fbb7b37f17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 systemd-udevd[232863]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:22 np0005592767 systemd-machined[153912]: New machine qemu-74-instance-00000091.
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.848 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[af9d5f05-3a13-4cc1-aed2-8dd0182720d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 NetworkManager[54973]: <info>  [1769121802.8557] device (tap4a077200-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:43:22 np0005592767 NetworkManager[54973]: <info>  [1769121802.8564] device (tap4a077200-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:43:22 np0005592767 systemd[1]: Started Virtual Machine qemu-74-instance-00000091.
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.877 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ce1a6d-de99-481f-8f22-e80ef88f894a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.881 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:22Z|00570|binding|INFO|Setting lport 4a077200-6d1a-4174-ba2c-090123ed6b58 ovn-installed in OVS
Jan 22 17:43:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:22Z|00571|binding|INFO|Setting lport 4a077200-6d1a-4174-ba2c-090123ed6b58 up in Southbound
Jan 22 17:43:22 np0005592767 nova_compute[182623]: 2026-01-22 22:43:22.883 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.915 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f268f5-4a4c-48e3-b548-43e717b6904b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 systemd-udevd[232866]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.921 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2e66b8e6-af66-4ac5-878b-b995e2e752e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 NetworkManager[54973]: <info>  [1769121802.9224] manager: (tap9b8224f0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/270)
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.954 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[540e4b97-19d1-4b91-9815-01d099d31529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.956 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[558557ed-23d1-4114-b940-6edf0a571511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:22 np0005592767 NetworkManager[54973]: <info>  [1769121802.9804] device (tap9b8224f0-00): carrier: link connected
Jan 22 17:43:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:22.986 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fddd257f-a51d-434c-8b43-b00003ee2281]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.005 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7aa09d-c473-4c4e-a28b-61550f6027f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b8224f0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:ef:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539955, 'reachable_time': 18984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232894, 'error': None, 'target': 'ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.020 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9d487374-253e-4199-9e75-7236e3c1fc6e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:ef64'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539955, 'tstamp': 539955}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232895, 'error': None, 'target': 'ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.037 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6f141af8-5838-4092-915f-57b78261b84b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b8224f0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:ef:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539955, 'reachable_time': 18984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232896, 'error': None, 'target': 'ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.067 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ea6b38fb-0633-419f-9dd9-bce275b6aeb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.125 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[387e623a-a648-4ddf-a83a-d978e21ebf21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.127 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b8224f0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.127 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.128 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b8224f0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:23 np0005592767 NetworkManager[54973]: <info>  [1769121803.1404] manager: (tap9b8224f0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 22 17:43:23 np0005592767 kernel: tap9b8224f0-00: entered promiscuous mode
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.142 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b8224f0-00, col_values=(('external_ids', {'iface-id': '93ed692b-12b1-4a5e-af78-c346b15d7d6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:23 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:23Z|00572|binding|INFO|Releasing lport 93ed692b-12b1-4a5e-af78-c346b15d7d6e from this chassis (sb_readonly=0)
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.143 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.145 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b8224f0-0e08-4065-b940-1530a6a30708.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b8224f0-0e08-4065-b940-1530a6a30708.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.145 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4570ced9-1de4-4c38-95d5-c99a7e84dab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.146 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-9b8224f0-0e08-4065-b940-1530a6a30708
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/9b8224f0-0e08-4065-b940-1530a6a30708.pid.haproxy
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 9b8224f0-0e08-4065-b940-1530a6a30708
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:43:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:23.147 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708', 'env', 'PROCESS_TAG=haproxy-9b8224f0-0e08-4065-b940-1530a6a30708', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b8224f0-0e08-4065-b940-1530a6a30708.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.154 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.188 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121803.187807, 1f55de0e-e258-4f65-a0e0-f26bebf85ccb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.189 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] VM Started (Lifecycle Event)#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.220 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.224 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121803.1891723, 1f55de0e-e258-4f65-a0e0-f26bebf85ccb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.225 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.247 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.250 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.279 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.385 182627 DEBUG nova.network.neutron [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated VIF entry in instance network info cache for port 4a077200-6d1a-4174-ba2c-090123ed6b58. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.386 182627 DEBUG nova.network.neutron [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.403 182627 DEBUG oslo_concurrency.lockutils [req-43efad59-4069-485e-8004-36108f1fe35f req-25dc70d1-88f7-4e66-b5a4-bd21c8517b68 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.441 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.442 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.458 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.548 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.549 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.555 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.555 182627 INFO nova.compute.claims [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:43:23 np0005592767 podman[232933]: 2026-01-22 22:43:23.575628791 +0000 UTC m=+0.061785885 container create 5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:43:23 np0005592767 systemd[1]: Started libpod-conmon-5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5.scope.
Jan 22 17:43:23 np0005592767 podman[232933]: 2026-01-22 22:43:23.540651454 +0000 UTC m=+0.026808628 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:43:23 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:43:23 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26edfef5876d025259af13b4d0d144dd099a17640c04e7537764de2a0bdf9a15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:43:23 np0005592767 podman[232933]: 2026-01-22 22:43:23.676736804 +0000 UTC m=+0.162893918 container init 5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:43:23 np0005592767 podman[232933]: 2026-01-22 22:43:23.689303538 +0000 UTC m=+0.175460632 container start 5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.689 182627 DEBUG nova.compute.provider_tree [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.707 182627 DEBUG nova.scheduler.client.report [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:43:23 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [NOTICE]   (232952) : New worker (232954) forked
Jan 22 17:43:23 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [NOTICE]   (232952) : Loading success.
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.748 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.749 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.808 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.808 182627 DEBUG nova.network.neutron [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.826 182627 INFO nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.843 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.902 182627 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Received event network-vif-deleted-c21fb745-ced9-4786-be32-0bb9c6cb73eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.902 182627 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.903 182627 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.903 182627 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.903 182627 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.904 182627 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Processing event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.904 182627 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.904 182627 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.905 182627 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.905 182627 DEBUG oslo_concurrency.lockutils [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.905 182627 DEBUG nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.905 182627 WARNING nova.compute.manager [req-90756d25-e8b6-4b85-a75c-3d713c43c619 req-36a85c26-c46d-4a56-bcb7-de0d45e02cd8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 for instance with vm_state building and task_state spawning.
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.906 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.910 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121803.9106183, 1f55de0e-e258-4f65-a0e0-f26bebf85ccb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.911 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] VM Resumed (Lifecycle Event)
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.913 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.918 182627 INFO nova.virt.libvirt.driver [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Instance spawned successfully.
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.919 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.935 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.942 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.944 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.944 182627 INFO nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Creating image(s)
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.945 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.946 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.947 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.973 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:23 np0005592767 nova_compute[182623]: 2026-01-22 22:43:23.997 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.006 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.006 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.007 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.008 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.009 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.009 182627 DEBUG nova.virt.libvirt.driver [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.019 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.029 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.030 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.030 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.046 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.073 182627 INFO nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Took 7.28 seconds to spawn the instance on the hypervisor.
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.073 182627 DEBUG nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.103 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.104 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.146 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.149 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.149 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.185 182627 INFO nova.compute.manager [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Took 7.96 seconds to build instance.
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.205 182627 DEBUG oslo_concurrency.lockutils [None req-1f6f3fde-75dc-43fe-9905-3c6bec27928d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.206 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.206 182627 DEBUG nova.virt.disk.api [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.207 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.295 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.297 182627 DEBUG nova.virt.disk.api [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.298 182627 DEBUG nova.objects.instance [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid 683611ab-5ba3-4de8-9412-1e6a9979bfd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.314 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.315 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Ensure instance console log exists: /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.315 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.316 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.316 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:24 np0005592767 nova_compute[182623]: 2026-01-22 22:43:24.989 182627 DEBUG nova.policy [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:43:25 np0005592767 nova_compute[182623]: 2026-01-22 22:43:25.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:26 np0005592767 nova_compute[182623]: 2026-01-22 22:43:26.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:26 np0005592767 nova_compute[182623]: 2026-01-22 22:43:26.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:43:26 np0005592767 nova_compute[182623]: 2026-01-22 22:43:26.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:43:26 np0005592767 nova_compute[182623]: 2026-01-22 22:43:26.927 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.070 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.139 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.140 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.140 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.140 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.208 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:27 np0005592767 nova_compute[182623]: 2026-01-22 22:43:27.523 182627 DEBUG nova.network.neutron [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Successfully created port: 12ad91b9-6082-4e56-a237-f5ee1523720d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:43:28 np0005592767 nova_compute[182623]: 2026-01-22 22:43:28.099 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:28 np0005592767 NetworkManager[54973]: <info>  [1769121808.1014] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Jan 22 17:43:28 np0005592767 NetworkManager[54973]: <info>  [1769121808.1034] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 22 17:43:28 np0005592767 nova_compute[182623]: 2026-01-22 22:43:28.181 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:28 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:28Z|00573|binding|INFO|Releasing lport 93ed692b-12b1-4a5e-af78-c346b15d7d6e from this chassis (sb_readonly=0)
Jan 22 17:43:28 np0005592767 nova_compute[182623]: 2026-01-22 22:43:28.215 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:28 np0005592767 nova_compute[182623]: 2026-01-22 22:43:28.229 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.085 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.108 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.109 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.110 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.111 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.143 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.144 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.145 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.146 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.152 182627 DEBUG nova.compute.manager [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-changed-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.153 182627 DEBUG nova.compute.manager [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing instance network info cache due to event network-changed-4a077200-6d1a-4174-ba2c-090123ed6b58. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.153 182627 DEBUG oslo_concurrency.lockutils [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.154 182627 DEBUG oslo_concurrency.lockutils [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.155 182627 DEBUG nova.network.neutron [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing network info cache for port 4a077200-6d1a-4174-ba2c-090123ed6b58 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.226 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:29 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:29Z|00574|binding|INFO|Releasing lport 93ed692b-12b1-4a5e-af78-c346b15d7d6e from this chassis (sb_readonly=0)
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.304 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.305 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.331 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.388 182627 DEBUG nova.network.neutron [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Successfully updated port: 12ad91b9-6082-4e56-a237-f5ee1523720d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.395 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.407 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.408 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.408 182627 DEBUG nova.network.neutron [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.499 182627 DEBUG nova.compute.manager [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-changed-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.500 182627 DEBUG nova.compute.manager [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Refreshing instance network info cache due to event network-changed-12ad91b9-6082-4e56-a237-f5ee1523720d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.500 182627 DEBUG oslo_concurrency.lockutils [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.603 182627 DEBUG nova.network.neutron [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.640 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.643 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5523MB free_disk=73.12035751342773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.643 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.644 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.752 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.752 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 683611ab-5ba3-4de8-9412-1e6a9979bfd1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.752 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.752 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.831 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.856 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.890 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:43:29 np0005592767 nova_compute[182623]: 2026-01-22 22:43:29.890 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:30 np0005592767 nova_compute[182623]: 2026-01-22 22:43:30.678 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:30 np0005592767 nova_compute[182623]: 2026-01-22 22:43:30.679 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:30 np0005592767 nova_compute[182623]: 2026-01-22 22:43:30.697 182627 DEBUG nova.network.neutron [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated VIF entry in instance network info cache for port 4a077200-6d1a-4174-ba2c-090123ed6b58. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:43:30 np0005592767 nova_compute[182623]: 2026-01-22 22:43:30.698 182627 DEBUG nova.network.neutron [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:30 np0005592767 nova_compute[182623]: 2026-01-22 22:43:30.718 182627 DEBUG oslo_concurrency.lockutils [req-aed42168-3d25-47b1-8ac3-f80ef332bcc7 req-f50bc694-18e8-4f6d-9ac7-8dfff3b6c6aa 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.018 182627 DEBUG nova.network.neutron [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updating instance_info_cache with network_info: [{"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.046 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.047 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Instance network_info: |[{"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.048 182627 DEBUG oslo_concurrency.lockutils [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.048 182627 DEBUG nova.network.neutron [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Refreshing network info cache for port 12ad91b9-6082-4e56-a237-f5ee1523720d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.054 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Start _get_guest_xml network_info=[{"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.059 182627 WARNING nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.064 182627 DEBUG nova.virt.libvirt.host [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.065 182627 DEBUG nova.virt.libvirt.host [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.073 182627 DEBUG nova.virt.libvirt.host [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.075 182627 DEBUG nova.virt.libvirt.host [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.077 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.078 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.078 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.079 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.079 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.080 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.080 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.081 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.081 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.082 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.082 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.083 182627 DEBUG nova.virt.hardware [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.089 182627 DEBUG nova.virt.libvirt.vif [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1046061331',display_name='tempest-TestGettingAddress-server-1046061331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1046061331',id=146,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsrHWNk+MksIsbEM7bSeBTQAY5qFK+NdGrNrV4Jug2ymXl5JnozZ6D12VJ+bihiK0cukhCA8rvVnVlSKz/acKBeNfTG+4t+iKytlBhtPIJZ7e/HRwM8zfkeN/WAYGd/Vg==',key_name='tempest-TestGettingAddress-445818544',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-icmv9k5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:23Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=683611ab-5ba3-4de8-9412-1e6a9979bfd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.089 182627 DEBUG nova.network.os_vif_util [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.091 182627 DEBUG nova.network.os_vif_util [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.092 182627 DEBUG nova.objects.instance [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 683611ab-5ba3-4de8-9412-1e6a9979bfd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.106 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <uuid>683611ab-5ba3-4de8-9412-1e6a9979bfd1</uuid>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <name>instance-00000092</name>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestGettingAddress-server-1046061331</nova:name>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:43:31</nova:creationTime>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        <nova:port uuid="12ad91b9-6082-4e56-a237-f5ee1523720d">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe13:f912" ipVersion="6"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <entry name="serial">683611ab-5ba3-4de8-9412-1e6a9979bfd1</entry>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <entry name="uuid">683611ab-5ba3-4de8-9412-1e6a9979bfd1</entry>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.config"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:13:f9:12"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <target dev="tap12ad91b9-60"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/console.log" append="off"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:43:31 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:43:31 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:43:31 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:43:31 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.109 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Preparing to wait for external event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.109 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.110 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.110 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.111 182627 DEBUG nova.virt.libvirt.vif [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:43:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1046061331',display_name='tempest-TestGettingAddress-server-1046061331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1046061331',id=146,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsrHWNk+MksIsbEM7bSeBTQAY5qFK+NdGrNrV4Jug2ymXl5JnozZ6D12VJ+bihiK0cukhCA8rvVnVlSKz/acKBeNfTG+4t+iKytlBhtPIJZ7e/HRwM8zfkeN/WAYGd/Vg==',key_name='tempest-TestGettingAddress-445818544',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-icmv9k5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:43:23Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=683611ab-5ba3-4de8-9412-1e6a9979bfd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.112 182627 DEBUG nova.network.os_vif_util [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.113 182627 DEBUG nova.network.os_vif_util [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.114 182627 DEBUG os_vif [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.115 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.115 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.116 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.120 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.121 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12ad91b9-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.122 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12ad91b9-60, col_values=(('external_ids', {'iface-id': '12ad91b9-6082-4e56-a237-f5ee1523720d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:f9:12', 'vm-uuid': '683611ab-5ba3-4de8-9412-1e6a9979bfd1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.125 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:31 np0005592767 NetworkManager[54973]: <info>  [1769121811.1261] manager: (tap12ad91b9-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.127 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.141 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.142 182627 INFO os_vif [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60')#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.201 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.202 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.202 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:13:f9:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.203 182627 INFO nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Using config drive#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.621 182627 INFO nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Creating config drive at /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.config#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.631 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp236bn0d5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.779 182627 DEBUG oslo_concurrency.processutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp236bn0d5" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:43:31 np0005592767 kernel: tap12ad91b9-60: entered promiscuous mode
Jan 22 17:43:31 np0005592767 NetworkManager[54973]: <info>  [1769121811.8697] manager: (tap12ad91b9-60): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.913 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:31Z|00575|binding|INFO|Claiming lport 12ad91b9-6082-4e56-a237-f5ee1523720d for this chassis.
Jan 22 17:43:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:31Z|00576|binding|INFO|12ad91b9-6082-4e56-a237-f5ee1523720d: Claiming fa:16:3e:13:f9:12 10.100.0.3 2001:db8::f816:3eff:fe13:f912
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.923 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:f9:12 10.100.0.3 2001:db8::f816:3eff:fe13:f912'], port_security=['fa:16:3e:13:f9:12 10.100.0.3 2001:db8::f816:3eff:fe13:f912'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe13:f912/64', 'neutron:device_id': '683611ab-5ba3-4de8-9412-1e6a9979bfd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eab4e1ae-84fa-4e4e-a3de-b3a819871504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b64fd7f9-9daa-4dd2-9dfa-7c863399e516, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=12ad91b9-6082-4e56-a237-f5ee1523720d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.925 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 12ad91b9-6082-4e56-a237-f5ee1523720d in datapath 3676296d-a568-47ea-b6cb-2ef8aff27f14 bound to our chassis#033[00m
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.927 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3676296d-a568-47ea-b6cb-2ef8aff27f14#033[00m
Jan 22 17:43:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:31Z|00577|binding|INFO|Setting lport 12ad91b9-6082-4e56-a237-f5ee1523720d ovn-installed in OVS
Jan 22 17:43:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:31Z|00578|binding|INFO|Setting lport 12ad91b9-6082-4e56-a237-f5ee1523720d up in Southbound
Jan 22 17:43:31 np0005592767 nova_compute[182623]: 2026-01-22 22:43:31.971 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.979 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[56880962-94e9-48b7-b6f1-627f29f38a2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.981 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3676296d-a1 in ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.989 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3676296d-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.989 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b469bd1d-6b61-4899-b7f0-44433e09e1e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:31 np0005592767 systemd-machined[153912]: New machine qemu-75-instance-00000092.
Jan 22 17:43:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:31.990 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d8d80a2-ff06-4542-9dc1-9afeb2b160fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 systemd-udevd[233007]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:32 np0005592767 systemd[1]: Started Virtual Machine qemu-75-instance-00000092.
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.010 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[f13b6976-985e-467a-81a2-e02e3b5c4c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 NetworkManager[54973]: <info>  [1769121812.0156] device (tap12ad91b9-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:43:32 np0005592767 NetworkManager[54973]: <info>  [1769121812.0164] device (tap12ad91b9-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.028 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec067d1-c11d-48e2-b32e-816eddef49d0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.058 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[148c24ac-d38b-41e6-a8eb-f86515e69810]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.063 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7b56212b-a006-4ae1-af87-8685b88040da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 NetworkManager[54973]: <info>  [1769121812.0647] manager: (tap3676296d-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Jan 22 17:43:32 np0005592767 systemd-udevd[233010]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.072 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.107 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[432f8564-b3a8-4335-8852-81de76f6537a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.111 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e51a44d2-c48b-4575-aa39-a098e676b58d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 NetworkManager[54973]: <info>  [1769121812.1421] device (tap3676296d-a0): carrier: link connected
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.154 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[acee4431-a975-4d1a-b76a-db00222140a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.181 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5eeb8f98-2bda-4d1d-8a20-624c74dce8eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3676296d-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:be:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540872, 'reachable_time': 26521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233038, 'error': None, 'target': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.205 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[328102fc-3d80-4064-a5d4-dbd644b6c59a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:be97'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540872, 'tstamp': 540872}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233039, 'error': None, 'target': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.228 182627 DEBUG nova.compute.manager [req-ff4ad556-bbf3-4087-aeca-9957be1d463c req-5f781069-80d3-4b3e-8b9d-62f5955cdd74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.229 182627 DEBUG oslo_concurrency.lockutils [req-ff4ad556-bbf3-4087-aeca-9957be1d463c req-5f781069-80d3-4b3e-8b9d-62f5955cdd74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.229 182627 DEBUG oslo_concurrency.lockutils [req-ff4ad556-bbf3-4087-aeca-9957be1d463c req-5f781069-80d3-4b3e-8b9d-62f5955cdd74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.229 182627 DEBUG oslo_concurrency.lockutils [req-ff4ad556-bbf3-4087-aeca-9957be1d463c req-5f781069-80d3-4b3e-8b9d-62f5955cdd74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.230 182627 DEBUG nova.compute.manager [req-ff4ad556-bbf3-4087-aeca-9957be1d463c req-5f781069-80d3-4b3e-8b9d-62f5955cdd74 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Processing event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.230 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb914f0-1760-435b-b2ed-9eaa6e411389]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3676296d-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:be:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540872, 'reachable_time': 26521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233040, 'error': None, 'target': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.262 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[17653c7a-d197-4ec7-9d38-620a61ae5f15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.330 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[45880996-3396-411d-8f8e-184bf2a31da9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.332 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3676296d-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.332 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.332 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3676296d-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.334 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:43:32 np0005592767 NetworkManager[54973]: <info>  [1769121812.3355] manager: (tap3676296d-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 22 17:43:32 np0005592767 kernel: tap3676296d-a0: entered promiscuous mode
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.335 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121812.3341062, 683611ab-5ba3-4de8-9412-1e6a9979bfd1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.338 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] VM Started (Lifecycle Event)#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.339 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3676296d-a0, col_values=(('external_ids', {'iface-id': '4c584d5e-ac75-444a-b20c-05a59b075ca2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:32Z|00579|binding|INFO|Releasing lport 4c584d5e-ac75-444a-b20c-05a59b075ca2 from this chassis (sb_readonly=0)
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.341 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.344 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.349 182627 INFO nova.virt.libvirt.driver [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Instance spawned successfully.#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.349 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.358 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.359 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3676296d-a568-47ea-b6cb-2ef8aff27f14.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3676296d-a568-47ea-b6cb-2ef8aff27f14.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.360 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e6904b-0134-4cfb-a947-148238d52f28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.361 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-3676296d-a568-47ea-b6cb-2ef8aff27f14
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/3676296d-a568-47ea-b6cb-2ef8aff27f14.pid.haproxy
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 3676296d-a568-47ea-b6cb-2ef8aff27f14
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:43:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:32.362 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'env', 'PROCESS_TAG=haproxy-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3676296d-a568-47ea-b6cb-2ef8aff27f14.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.373 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.378 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.380 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.381 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.381 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.382 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.382 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.383 182627 DEBUG nova.virt.libvirt.driver [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.464 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.466 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121812.3387778, 683611ab-5ba3-4de8-9412-1e6a9979bfd1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.467 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.512 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.518 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121812.3412495, 683611ab-5ba3-4de8-9412-1e6a9979bfd1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.518 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.542 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.546 182627 INFO nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Took 8.60 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.547 182627 DEBUG nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.552 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.587 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.663 182627 INFO nova.compute.manager [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Took 9.15 seconds to build instance.#033[00m
Jan 22 17:43:32 np0005592767 nova_compute[182623]: 2026-01-22 22:43:32.698 182627 DEBUG oslo_concurrency.lockutils [None req-0525faca-6456-45e4-a27b-0a5ac8b47c05 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:32 np0005592767 podman[233079]: 2026-01-22 22:43:32.786829473 +0000 UTC m=+0.057813643 container create b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:43:32 np0005592767 systemd[1]: Started libpod-conmon-b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b.scope.
Jan 22 17:43:32 np0005592767 podman[233079]: 2026-01-22 22:43:32.758900424 +0000 UTC m=+0.029884624 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:43:32 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:43:32 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e03f31b288506e24ed256c65ce9e56876263451a9d8b61c0b1f9b80b24bcb52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:43:32 np0005592767 podman[233079]: 2026-01-22 22:43:32.888514282 +0000 UTC m=+0.159498482 container init b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:43:32 np0005592767 podman[233079]: 2026-01-22 22:43:32.893718089 +0000 UTC m=+0.164702259 container start b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:43:32 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [NOTICE]   (233118) : New worker (233120) forked
Jan 22 17:43:32 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [NOTICE]   (233118) : Loading success.
Jan 22 17:43:32 np0005592767 podman[233093]: 2026-01-22 22:43:32.92532691 +0000 UTC m=+0.092569873 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:43:33 np0005592767 nova_compute[182623]: 2026-01-22 22:43:33.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:33 np0005592767 nova_compute[182623]: 2026-01-22 22:43:33.937 182627 DEBUG nova.network.neutron [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updated VIF entry in instance network info cache for port 12ad91b9-6082-4e56-a237-f5ee1523720d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:43:33 np0005592767 nova_compute[182623]: 2026-01-22 22:43:33.938 182627 DEBUG nova.network.neutron [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updating instance_info_cache with network_info: [{"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:33 np0005592767 nova_compute[182623]: 2026-01-22 22:43:33.956 182627 DEBUG oslo_concurrency.lockutils [req-fd517ae0-4743-4251-b420-eb2b21d3754d req-70dac81c-9f2e-4614-bcfd-a9e4aa4ba6d3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.187 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121799.1814134, 73618954-39df-4c9d-b2ed-36e51779ac81 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.190 182627 INFO nova.compute.manager [-] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.231 182627 DEBUG nova.compute.manager [None req-5a9fc5fd-62cc-41f0-a5d4-4999fc626927 - - - - - -] [instance: 73618954-39df-4c9d-b2ed-36e51779ac81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.332 182627 DEBUG nova.compute.manager [req-02e5cd49-8095-4cf9-9148-3f4ebaf59811 req-3ca36db8-472f-4743-b097-c6015517a4af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.333 182627 DEBUG oslo_concurrency.lockutils [req-02e5cd49-8095-4cf9-9148-3f4ebaf59811 req-3ca36db8-472f-4743-b097-c6015517a4af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.333 182627 DEBUG oslo_concurrency.lockutils [req-02e5cd49-8095-4cf9-9148-3f4ebaf59811 req-3ca36db8-472f-4743-b097-c6015517a4af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.334 182627 DEBUG oslo_concurrency.lockutils [req-02e5cd49-8095-4cf9-9148-3f4ebaf59811 req-3ca36db8-472f-4743-b097-c6015517a4af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.334 182627 DEBUG nova.compute.manager [req-02e5cd49-8095-4cf9-9148-3f4ebaf59811 req-3ca36db8-472f-4743-b097-c6015517a4af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] No waiting events found dispatching network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:34 np0005592767 nova_compute[182623]: 2026-01-22 22:43:34.334 182627 WARNING nova.compute.manager [req-02e5cd49-8095-4cf9-9148-3f4ebaf59811 req-3ca36db8-472f-4743-b097-c6015517a4af 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received unexpected event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d for instance with vm_state active and task_state None.#033[00m
Jan 22 17:43:36 np0005592767 nova_compute[182623]: 2026-01-22 22:43:36.126 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:36 np0005592767 podman[233144]: 2026-01-22 22:43:36.178352588 +0000 UTC m=+0.077837287 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 22 17:43:36 np0005592767 podman[233143]: 2026-01-22 22:43:36.196559702 +0000 UTC m=+0.107709390 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 17:43:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:36Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:c1:ef 10.100.0.3
Jan 22 17:43:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:36Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:c1:ef 10.100.0.3
Jan 22 17:43:36 np0005592767 nova_compute[182623]: 2026-01-22 22:43:36.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:37 np0005592767 nova_compute[182623]: 2026-01-22 22:43:37.076 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:39 np0005592767 nova_compute[182623]: 2026-01-22 22:43:39.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.009 182627 DEBUG nova.compute.manager [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-changed-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.010 182627 DEBUG nova.compute.manager [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Refreshing instance network info cache due to event network-changed-12ad91b9-6082-4e56-a237-f5ee1523720d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.010 182627 DEBUG oslo_concurrency.lockutils [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.010 182627 DEBUG oslo_concurrency.lockutils [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.011 182627 DEBUG nova.network.neutron [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Refreshing network info cache for port 12ad91b9-6082-4e56-a237-f5ee1523720d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.130 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.293 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.839 182627 INFO nova.compute.manager [None req-d9fa5c21-36d8-4e85-859b-f934b89262e7 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Get console output#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.869 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:43:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:41.987 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:41 np0005592767 nova_compute[182623]: 2026-01-22 22:43:41.990 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:41 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:41.991 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:43:42 np0005592767 nova_compute[182623]: 2026-01-22 22:43:42.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:42 np0005592767 nova_compute[182623]: 2026-01-22 22:43:42.986 182627 DEBUG nova.network.neutron [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updated VIF entry in instance network info cache for port 12ad91b9-6082-4e56-a237-f5ee1523720d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:43:42 np0005592767 nova_compute[182623]: 2026-01-22 22:43:42.987 182627 DEBUG nova.network.neutron [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updating instance_info_cache with network_info: [{"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:43 np0005592767 nova_compute[182623]: 2026-01-22 22:43:43.015 182627 DEBUG oslo_concurrency.lockutils [req-84a1af77-68fd-4f2a-a397-28fa3ee2aa40 req-8538792a-d3ab-48ce-a563-a0513ac9d329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:44 np0005592767 nova_compute[182623]: 2026-01-22 22:43:44.035 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:44Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:13:f9:12 10.100.0.3
Jan 22 17:43:44 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:44Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:13:f9:12 10.100.0.3
Jan 22 17:43:45 np0005592767 podman[233207]: 2026-01-22 22:43:45.149324284 +0000 UTC m=+0.057227106 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:43:45 np0005592767 podman[233206]: 2026-01-22 22:43:45.151715832 +0000 UTC m=+0.064991355 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:43:45 np0005592767 nova_compute[182623]: 2026-01-22 22:43:45.325 182627 DEBUG oslo_concurrency.lockutils [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "interface-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:45 np0005592767 nova_compute[182623]: 2026-01-22 22:43:45.325 182627 DEBUG oslo_concurrency.lockutils [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:45 np0005592767 nova_compute[182623]: 2026-01-22 22:43:45.326 182627 DEBUG nova.objects.instance [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'flavor' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:45 np0005592767 nova_compute[182623]: 2026-01-22 22:43:45.729 182627 DEBUG nova.objects.instance [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:45 np0005592767 nova_compute[182623]: 2026-01-22 22:43:45.744 182627 DEBUG nova.network.neutron [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:43:45 np0005592767 nova_compute[182623]: 2026-01-22 22:43:45.938 182627 DEBUG nova.policy [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:43:46 np0005592767 nova_compute[182623]: 2026-01-22 22:43:46.147 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:46 np0005592767 nova_compute[182623]: 2026-01-22 22:43:46.614 182627 DEBUG nova.network.neutron [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Successfully created port: 58e15b42-1139-4a64-ba76-2af3eca46aa1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.082 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.449 182627 DEBUG nova.network.neutron [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Successfully updated port: 58e15b42-1139-4a64-ba76-2af3eca46aa1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.466 182627 DEBUG oslo_concurrency.lockutils [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.467 182627 DEBUG oslo_concurrency.lockutils [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.467 182627 DEBUG nova.network.neutron [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.982 182627 DEBUG nova.compute.manager [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-changed-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.983 182627 DEBUG nova.compute.manager [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing instance network info cache due to event network-changed-58e15b42-1139-4a64-ba76-2af3eca46aa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:47 np0005592767 nova_compute[182623]: 2026-01-22 22:43:47.983 182627 DEBUG oslo_concurrency.lockutils [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:48.995 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:51 np0005592767 nova_compute[182623]: 2026-01-22 22:43:51.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.085 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 podman[233249]: 2026-01-22 22:43:52.164051141 +0000 UTC m=+0.064484841 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.658 182627 DEBUG nova.network.neutron [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.683 182627 DEBUG oslo_concurrency.lockutils [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.684 182627 DEBUG oslo_concurrency.lockutils [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.684 182627 DEBUG nova.network.neutron [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing network info cache for port 58e15b42-1139-4a64-ba76-2af3eca46aa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.686 182627 DEBUG nova.virt.libvirt.vif [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:24Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.687 182627 DEBUG nova.network.os_vif_util [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.687 182627 DEBUG nova.network.os_vif_util [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.688 182627 DEBUG os_vif [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.688 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.689 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.689 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.692 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.693 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58e15b42-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.693 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58e15b42-11, col_values=(('external_ids', {'iface-id': '58e15b42-1139-4a64-ba76-2af3eca46aa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:78:25', 'vm-uuid': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.695 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 NetworkManager[54973]: <info>  [1769121832.6961] manager: (tap58e15b42-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.699 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.701 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.702 182627 INFO os_vif [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11')#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.702 182627 DEBUG nova.virt.libvirt.vif [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:24Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.703 182627 DEBUG nova.network.os_vif_util [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.703 182627 DEBUG nova.network.os_vif_util [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.706 182627 DEBUG nova.virt.libvirt.guest [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] attach device xml: <interface type="ethernet">
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:fb:78:25"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <target dev="tap58e15b42-11"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:43:52 np0005592767 nova_compute[182623]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 22 17:43:52 np0005592767 kernel: tap58e15b42-11: entered promiscuous mode
Jan 22 17:43:52 np0005592767 NetworkManager[54973]: <info>  [1769121832.7162] manager: (tap58e15b42-11): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Jan 22 17:43:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:52Z|00580|binding|INFO|Claiming lport 58e15b42-1139-4a64-ba76-2af3eca46aa1 for this chassis.
Jan 22 17:43:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:52Z|00581|binding|INFO|58e15b42-1139-4a64-ba76-2af3eca46aa1: Claiming fa:16:3e:fb:78:25 10.100.0.19
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.718 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.721 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.729 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:78:25 10.100.0.19'], port_security=['fa:16:3e:fb:78:25 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '28858496-9de7-4d51-8064-4ba52669cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1f24d00-ff46-49f8-bf4a-1cd04781c3bf, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=58e15b42-1139-4a64-ba76-2af3eca46aa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.731 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 58e15b42-1139-4a64-ba76-2af3eca46aa1 in datapath d9c983ad-4a50-4312-a557-2e1872b74fdf bound to our chassis#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.736 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9c983ad-4a50-4312-a557-2e1872b74fdf#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.749 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[725c9b58-13b9-42ba-88ac-8f955a4c1660]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.750 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9c983ad-41 in ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.753 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9c983ad-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.753 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5de01d5e-f19e-46fa-be5f-3756d316f2cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.753 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[301aa4ad-5fb8-441f-a77b-61b1e51a26ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.754 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:52Z|00582|binding|INFO|Setting lport 58e15b42-1139-4a64-ba76-2af3eca46aa1 ovn-installed in OVS
Jan 22 17:43:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:52Z|00583|binding|INFO|Setting lport 58e15b42-1139-4a64-ba76-2af3eca46aa1 up in Southbound
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:52 np0005592767 systemd-udevd[233280]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.769 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[9460cac8-d8f6-4b01-be7b-6f8f0930c1da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 NetworkManager[54973]: <info>  [1769121832.7788] device (tap58e15b42-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:43:52 np0005592767 NetworkManager[54973]: <info>  [1769121832.7797] device (tap58e15b42-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.784 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee41825-45ad-4684-a4d8-d91e6273d60d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.816 182627 DEBUG nova.virt.libvirt.driver [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.816 182627 DEBUG nova.virt.libvirt.driver [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.816 182627 DEBUG nova.virt.libvirt.driver [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:48:c1:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.817 182627 DEBUG nova.virt.libvirt.driver [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:fb:78:25, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.817 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa78c2f-c04b-4f50-acb1-aaa35e08080a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.823 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c5748b89-e2c8-42a0-8ec1-5d0fbd4d39ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 NetworkManager[54973]: <info>  [1769121832.8246] manager: (tapd9c983ad-40): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.854 182627 DEBUG nova.virt.libvirt.guest [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:name>tempest-TestNetworkBasicOps-server-1043176814</nova:name>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:43:52</nova:creationTime>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:port uuid="4a077200-6d1a-4174-ba2c-090123ed6b58">
Jan 22 17:43:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    <nova:port uuid="58e15b42-1139-4a64-ba76-2af3eca46aa1">
Jan 22 17:43:52 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:43:52 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:43:52 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:43:52 np0005592767 nova_compute[182623]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.856 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5083f9b7-ee7a-43fb-9980-a6cca1f6f5e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.860 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e43595a7-1fdd-4472-b2c0-2241c04d16fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 NetworkManager[54973]: <info>  [1769121832.8852] device (tapd9c983ad-40): carrier: link connected
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.890 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[80503fbc-93a0-4bac-815c-e5b7c76cb7e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 nova_compute[182623]: 2026-01-22 22:43:52.895 182627 DEBUG oslo_concurrency.lockutils [None req-3162e605-fe2c-4bc9-9c62-dc1d25eb1e51 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.907 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[928f543c-4d6f-4b99-98e2-a14830e9b80c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9c983ad-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:d6:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542946, 'reachable_time': 26760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233306, 'error': None, 'target': 'ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.924 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[45f18853-7e91-4ca9-822f-fc909bd7b93b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefb:d66c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542946, 'tstamp': 542946}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233307, 'error': None, 'target': 'ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.941 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5d384774-26e6-4ca7-9a0e-2ec0f5c1a102]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9c983ad-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:d6:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542946, 'reachable_time': 26760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233308, 'error': None, 'target': 'ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:52.973 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6e9313ff-0107-4f82-bdf0-d13208df34a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.030 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[af79e52a-6015-4442-b2f1-5fa0d63f6f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.031 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9c983ad-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.032 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.032 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9c983ad-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:53 np0005592767 NetworkManager[54973]: <info>  [1769121833.0350] manager: (tapd9c983ad-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 22 17:43:53 np0005592767 kernel: tapd9c983ad-40: entered promiscuous mode
Jan 22 17:43:53 np0005592767 nova_compute[182623]: 2026-01-22 22:43:53.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.039 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9c983ad-40, col_values=(('external_ids', {'iface-id': '1c942ec0-2d4f-4b23-9de5-bffad84574f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:53Z|00584|binding|INFO|Releasing lport 1c942ec0-2d4f-4b23-9de5-bffad84574f0 from this chassis (sb_readonly=0)
Jan 22 17:43:53 np0005592767 nova_compute[182623]: 2026-01-22 22:43:53.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:53 np0005592767 nova_compute[182623]: 2026-01-22 22:43:53.042 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.043 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9c983ad-4a50-4312-a557-2e1872b74fdf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9c983ad-4a50-4312-a557-2e1872b74fdf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.043 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[535a7333-6588-4ff5-9977-40ba76cf670f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.045 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-d9c983ad-4a50-4312-a557-2e1872b74fdf
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/d9c983ad-4a50-4312-a557-2e1872b74fdf.pid.haproxy
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID d9c983ad-4a50-4312-a557-2e1872b74fdf
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:43:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:53.045 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'env', 'PROCESS_TAG=haproxy-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9c983ad-4a50-4312-a557-2e1872b74fdf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:43:53 np0005592767 nova_compute[182623]: 2026-01-22 22:43:53.053 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:53 np0005592767 podman[233340]: 2026-01-22 22:43:53.420647837 +0000 UTC m=+0.048432128 container create c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:43:53 np0005592767 systemd[1]: Started libpod-conmon-c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a.scope.
Jan 22 17:43:53 np0005592767 podman[233340]: 2026-01-22 22:43:53.396554047 +0000 UTC m=+0.024338358 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:43:53 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:43:53 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a0e6a59e6d348b369975f87c6ebd3799720c4ee42c25bd852d8be5922a1875a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:43:53 np0005592767 podman[233340]: 2026-01-22 22:43:53.520628018 +0000 UTC m=+0.148412389 container init c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:43:53 np0005592767 podman[233340]: 2026-01-22 22:43:53.525913277 +0000 UTC m=+0.153697608 container start c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:43:53 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [NOTICE]   (233360) : New worker (233362) forked
Jan 22 17:43:53 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [NOTICE]   (233360) : Loading success.
Jan 22 17:43:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:54Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fb:78:25 10.100.0.19
Jan 22 17:43:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:54Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fb:78:25 10.100.0.19
Jan 22 17:43:55 np0005592767 nova_compute[182623]: 2026-01-22 22:43:55.688 182627 DEBUG nova.compute.manager [req-5ec8ff62-15cb-47d9-9a4c-201cb6619ab4 req-10db2917-fe2a-4243-8e87-f6bc04f98aeb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:55 np0005592767 nova_compute[182623]: 2026-01-22 22:43:55.688 182627 DEBUG oslo_concurrency.lockutils [req-5ec8ff62-15cb-47d9-9a4c-201cb6619ab4 req-10db2917-fe2a-4243-8e87-f6bc04f98aeb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:55 np0005592767 nova_compute[182623]: 2026-01-22 22:43:55.689 182627 DEBUG oslo_concurrency.lockutils [req-5ec8ff62-15cb-47d9-9a4c-201cb6619ab4 req-10db2917-fe2a-4243-8e87-f6bc04f98aeb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:55 np0005592767 nova_compute[182623]: 2026-01-22 22:43:55.689 182627 DEBUG oslo_concurrency.lockutils [req-5ec8ff62-15cb-47d9-9a4c-201cb6619ab4 req-10db2917-fe2a-4243-8e87-f6bc04f98aeb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:55 np0005592767 nova_compute[182623]: 2026-01-22 22:43:55.689 182627 DEBUG nova.compute.manager [req-5ec8ff62-15cb-47d9-9a4c-201cb6619ab4 req-10db2917-fe2a-4243-8e87-f6bc04f98aeb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:55 np0005592767 nova_compute[182623]: 2026-01-22 22:43:55.689 182627 WARNING nova.compute.manager [req-5ec8ff62-15cb-47d9-9a4c-201cb6619ab4 req-10db2917-fe2a-4243-8e87-f6bc04f98aeb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:43:56 np0005592767 nova_compute[182623]: 2026-01-22 22:43:56.575 182627 DEBUG nova.network.neutron [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated VIF entry in instance network info cache for port 58e15b42-1139-4a64-ba76-2af3eca46aa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:43:56 np0005592767 nova_compute[182623]: 2026-01-22 22:43:56.575 182627 DEBUG nova.network.neutron [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:43:56 np0005592767 nova_compute[182623]: 2026-01-22 22:43:56.590 182627 DEBUG oslo_concurrency.lockutils [req-fb485037-eb42-4ac7-b29b-cf3f599c527e req-ed6e4039-5521-4ed3-b215-e3f5b99dffb4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.087 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.874 182627 DEBUG nova.compute.manager [req-7da18de3-94e1-4bfe-9905-4829b1bf1055 req-bf32524e-7dd6-4e3e-b686-c29e24abdfe2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.874 182627 DEBUG oslo_concurrency.lockutils [req-7da18de3-94e1-4bfe-9905-4829b1bf1055 req-bf32524e-7dd6-4e3e-b686-c29e24abdfe2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.874 182627 DEBUG oslo_concurrency.lockutils [req-7da18de3-94e1-4bfe-9905-4829b1bf1055 req-bf32524e-7dd6-4e3e-b686-c29e24abdfe2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.875 182627 DEBUG oslo_concurrency.lockutils [req-7da18de3-94e1-4bfe-9905-4829b1bf1055 req-bf32524e-7dd6-4e3e-b686-c29e24abdfe2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.875 182627 DEBUG nova.compute.manager [req-7da18de3-94e1-4bfe-9905-4829b1bf1055 req-bf32524e-7dd6-4e3e-b686-c29e24abdfe2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:57 np0005592767 nova_compute[182623]: 2026-01-22 22:43:57.875 182627 WARNING nova.compute.manager [req-7da18de3-94e1-4bfe-9905-4829b1bf1055 req-bf32524e-7dd6-4e3e-b686-c29e24abdfe2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.200 182627 DEBUG nova.compute.manager [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-changed-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.200 182627 DEBUG nova.compute.manager [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Refreshing instance network info cache due to event network-changed-12ad91b9-6082-4e56-a237-f5ee1523720d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.201 182627 DEBUG oslo_concurrency.lockutils [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.201 182627 DEBUG oslo_concurrency.lockutils [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.201 182627 DEBUG nova.network.neutron [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Refreshing network info cache for port 12ad91b9-6082-4e56-a237-f5ee1523720d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.245 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.245 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.245 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.246 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.246 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.259 182627 INFO nova.compute.manager [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Terminating instance#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.269 182627 DEBUG nova.compute.manager [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:43:58 np0005592767 kernel: tap12ad91b9-60 (unregistering): left promiscuous mode
Jan 22 17:43:58 np0005592767 NetworkManager[54973]: <info>  [1769121838.2941] device (tap12ad91b9-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:58Z|00585|binding|INFO|Releasing lport 12ad91b9-6082-4e56-a237-f5ee1523720d from this chassis (sb_readonly=0)
Jan 22 17:43:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:58Z|00586|binding|INFO|Setting lport 12ad91b9-6082-4e56-a237-f5ee1523720d down in Southbound
Jan 22 17:43:58 np0005592767 ovn_controller[94769]: 2026-01-22T22:43:58Z|00587|binding|INFO|Removing iface tap12ad91b9-60 ovn-installed in OVS
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.306 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.312 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:f9:12 10.100.0.3 2001:db8::f816:3eff:fe13:f912'], port_security=['fa:16:3e:13:f9:12 10.100.0.3 2001:db8::f816:3eff:fe13:f912'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe13:f912/64', 'neutron:device_id': '683611ab-5ba3-4de8-9412-1e6a9979bfd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'eab4e1ae-84fa-4e4e-a3de-b3a819871504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b64fd7f9-9daa-4dd2-9dfa-7c863399e516, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=12ad91b9-6082-4e56-a237-f5ee1523720d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.313 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 12ad91b9-6082-4e56-a237-f5ee1523720d in datapath 3676296d-a568-47ea-b6cb-2ef8aff27f14 unbound from our chassis#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.315 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3676296d-a568-47ea-b6cb-2ef8aff27f14, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.316 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[270904a1-020a-4ca3-9292-31305ff0eeca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.317 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14 namespace which is not needed anymore#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.318 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 22 17:43:58 np0005592767 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000092.scope: Consumed 13.660s CPU time.
Jan 22 17:43:58 np0005592767 systemd-machined[153912]: Machine qemu-75-instance-00000092 terminated.
Jan 22 17:43:58 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [NOTICE]   (233118) : haproxy version is 2.8.14-c23fe91
Jan 22 17:43:58 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [NOTICE]   (233118) : path to executable is /usr/sbin/haproxy
Jan 22 17:43:58 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [WARNING]  (233118) : Exiting Master process...
Jan 22 17:43:58 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [WARNING]  (233118) : Exiting Master process...
Jan 22 17:43:58 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [ALERT]    (233118) : Current worker (233120) exited with code 143 (Terminated)
Jan 22 17:43:58 np0005592767 neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14[233096]: [WARNING]  (233118) : All workers exited. Exiting... (0)
Jan 22 17:43:58 np0005592767 systemd[1]: libpod-b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b.scope: Deactivated successfully.
Jan 22 17:43:58 np0005592767 podman[233396]: 2026-01-22 22:43:58.478798876 +0000 UTC m=+0.057450942 container died b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:43:58 np0005592767 kernel: tap12ad91b9-60: entered promiscuous mode
Jan 22 17:43:58 np0005592767 kernel: tap12ad91b9-60 (unregistering): left promiscuous mode
Jan 22 17:43:58 np0005592767 NetworkManager[54973]: <info>  [1769121838.4949] manager: (tap12ad91b9-60): new Tun device (/org/freedesktop/NetworkManager/Devices/282)
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.497 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b-userdata-shm.mount: Deactivated successfully.
Jan 22 17:43:58 np0005592767 systemd[1]: var-lib-containers-storage-overlay-3e03f31b288506e24ed256c65ce9e56876263451a9d8b61c0b1f9b80b24bcb52-merged.mount: Deactivated successfully.
Jan 22 17:43:58 np0005592767 podman[233396]: 2026-01-22 22:43:58.517759715 +0000 UTC m=+0.096411781 container cleanup b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:43:58 np0005592767 systemd[1]: libpod-conmon-b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b.scope: Deactivated successfully.
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.540 182627 DEBUG nova.compute.manager [req-d2469aec-e9fb-4c6e-a161-93593650ec67 req-27bef16b-4939-48be-8c41-a2e094315377 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-vif-unplugged-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.540 182627 DEBUG oslo_concurrency.lockutils [req-d2469aec-e9fb-4c6e-a161-93593650ec67 req-27bef16b-4939-48be-8c41-a2e094315377 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.540 182627 DEBUG oslo_concurrency.lockutils [req-d2469aec-e9fb-4c6e-a161-93593650ec67 req-27bef16b-4939-48be-8c41-a2e094315377 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.541 182627 DEBUG oslo_concurrency.lockutils [req-d2469aec-e9fb-4c6e-a161-93593650ec67 req-27bef16b-4939-48be-8c41-a2e094315377 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.541 182627 DEBUG nova.compute.manager [req-d2469aec-e9fb-4c6e-a161-93593650ec67 req-27bef16b-4939-48be-8c41-a2e094315377 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] No waiting events found dispatching network-vif-unplugged-12ad91b9-6082-4e56-a237-f5ee1523720d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.541 182627 DEBUG nova.compute.manager [req-d2469aec-e9fb-4c6e-a161-93593650ec67 req-27bef16b-4939-48be-8c41-a2e094315377 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-vif-unplugged-12ad91b9-6082-4e56-a237-f5ee1523720d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.542 182627 INFO nova.virt.libvirt.driver [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Instance destroyed successfully.#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.542 182627 DEBUG nova.objects.instance [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid 683611ab-5ba3-4de8-9412-1e6a9979bfd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.556 182627 DEBUG nova.virt.libvirt.vif [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1046061331',display_name='tempest-TestGettingAddress-server-1046061331',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1046061331',id=146,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDsrHWNk+MksIsbEM7bSeBTQAY5qFK+NdGrNrV4Jug2ymXl5JnozZ6D12VJ+bihiK0cukhCA8rvVnVlSKz/acKBeNfTG+4t+iKytlBhtPIJZ7e/HRwM8zfkeN/WAYGd/Vg==',key_name='tempest-TestGettingAddress-445818544',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-icmv9k5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:32Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=683611ab-5ba3-4de8-9412-1e6a9979bfd1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.557 182627 DEBUG nova.network.os_vif_util [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.558 182627 DEBUG nova.network.os_vif_util [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.558 182627 DEBUG os_vif [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.560 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.561 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12ad91b9-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.567 182627 INFO os_vif [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:f9:12,bridge_name='br-int',has_traffic_filtering=True,id=12ad91b9-6082-4e56-a237-f5ee1523720d,network=Network(3676296d-a568-47ea-b6cb-2ef8aff27f14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12ad91b9-60')#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.568 182627 INFO nova.virt.libvirt.driver [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Deleting instance files /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1_del#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.569 182627 INFO nova.virt.libvirt.driver [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Deletion of /var/lib/nova/instances/683611ab-5ba3-4de8-9412-1e6a9979bfd1_del complete#033[00m
Jan 22 17:43:58 np0005592767 podman[233438]: 2026-01-22 22:43:58.59517144 +0000 UTC m=+0.052225775 container remove b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.600 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fec2bf3b-d233-4c6d-bfd8-e4cbb82f8a45]: (4, ('Thu Jan 22 10:43:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14 (b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b)\nb1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b\nThu Jan 22 10:43:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14 (b1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b)\nb1d4698edea302dbd8262d44018962ae37b23da0012b786239f2894ccf142d8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.602 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c85e61c5-c1d7-43d2-b4ab-3656085d4d66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.603 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3676296d-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.604 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 kernel: tap3676296d-a0: left promiscuous mode
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.617 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.620 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2464463d-5f6e-46d8-b790-3535a5a59bfc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.636 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0adfc2-29ca-42bb-b5ba-de3939873a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.638 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b2eebfdc-e964-4f0d-9423-5d456fe8c16b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.663 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0e407c-f254-4b9c-b3bf-41403a71b3ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540863, 'reachable_time': 16649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233454, 'error': None, 'target': 'ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 systemd[1]: run-netns-ovnmeta\x2d3676296d\x2da568\x2d47ea\x2db6cb\x2d2ef8aff27f14.mount: Deactivated successfully.
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.667 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3676296d-a568-47ea-b6cb-2ef8aff27f14 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:43:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:43:58.667 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[b961caac-e34f-4b3d-8ea7-ccfe431ce635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.696 182627 INFO nova.compute.manager [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.696 182627 DEBUG oslo.service.loopingcall [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.696 182627 DEBUG nova.compute.manager [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:43:58 np0005592767 nova_compute[182623]: 2026-01-22 22:43:58.697 182627 DEBUG nova.network.neutron [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:44:00 np0005592767 nova_compute[182623]: 2026-01-22 22:44:00.625 182627 DEBUG nova.compute.manager [req-62e96333-ee4b-4a1b-ab1e-d87b326d1d0e req-839688d5-e721-4e5c-8b66-cb13bbf271ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:00 np0005592767 nova_compute[182623]: 2026-01-22 22:44:00.625 182627 DEBUG oslo_concurrency.lockutils [req-62e96333-ee4b-4a1b-ab1e-d87b326d1d0e req-839688d5-e721-4e5c-8b66-cb13bbf271ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:00 np0005592767 nova_compute[182623]: 2026-01-22 22:44:00.625 182627 DEBUG oslo_concurrency.lockutils [req-62e96333-ee4b-4a1b-ab1e-d87b326d1d0e req-839688d5-e721-4e5c-8b66-cb13bbf271ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:00 np0005592767 nova_compute[182623]: 2026-01-22 22:44:00.626 182627 DEBUG oslo_concurrency.lockutils [req-62e96333-ee4b-4a1b-ab1e-d87b326d1d0e req-839688d5-e721-4e5c-8b66-cb13bbf271ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:00 np0005592767 nova_compute[182623]: 2026-01-22 22:44:00.626 182627 DEBUG nova.compute.manager [req-62e96333-ee4b-4a1b-ab1e-d87b326d1d0e req-839688d5-e721-4e5c-8b66-cb13bbf271ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] No waiting events found dispatching network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:00 np0005592767 nova_compute[182623]: 2026-01-22 22:44:00.626 182627 WARNING nova.compute.manager [req-62e96333-ee4b-4a1b-ab1e-d87b326d1d0e req-839688d5-e721-4e5c-8b66-cb13bbf271ed 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received unexpected event network-vif-plugged-12ad91b9-6082-4e56-a237-f5ee1523720d for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.258 182627 DEBUG nova.network.neutron [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.282 182627 INFO nova.compute.manager [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Took 2.59 seconds to deallocate network for instance.#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.379 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.380 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.448 182627 DEBUG nova.compute.provider_tree [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.475 182627 DEBUG nova.scheduler.client.report [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.499 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.519 182627 INFO nova.scheduler.client.report [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance 683611ab-5ba3-4de8-9412-1e6a9979bfd1#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.596 182627 DEBUG oslo_concurrency.lockutils [None req-36b707c7-c9af-47f2-8648-b04a80b83972 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "683611ab-5ba3-4de8-9412-1e6a9979bfd1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.838 182627 DEBUG nova.network.neutron [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updated VIF entry in instance network info cache for port 12ad91b9-6082-4e56-a237-f5ee1523720d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.838 182627 DEBUG nova.network.neutron [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Updating instance_info_cache with network_info: [{"id": "12ad91b9-6082-4e56-a237-f5ee1523720d", "address": "fa:16:3e:13:f9:12", "network": {"id": "3676296d-a568-47ea-b6cb-2ef8aff27f14", "bridge": "br-int", "label": "tempest-network-smoke--1928312095", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe13:f912", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12ad91b9-60", "ovs_interfaceid": "12ad91b9-6082-4e56-a237-f5ee1523720d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:01 np0005592767 nova_compute[182623]: 2026-01-22 22:44:01.869 182627 DEBUG oslo_concurrency.lockutils [req-030e8638-3545-4703-8365-1b2d1e5e7131 req-29bd014b-a605-44c7-a8e1-888f95b7f590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-683611ab-5ba3-4de8-9412-1e6a9979bfd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:02 np0005592767 nova_compute[182623]: 2026-01-22 22:44:02.090 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:02 np0005592767 nova_compute[182623]: 2026-01-22 22:44:02.723 182627 DEBUG nova.compute.manager [req-f98703fe-dfb1-4cee-bb22-b636019e839b req-0b77ab1d-8600-430f-ad13-0f76a716cebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Received event network-vif-deleted-12ad91b9-6082-4e56-a237-f5ee1523720d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:03 np0005592767 podman[233455]: 2026-01-22 22:44:03.154993408 +0000 UTC m=+0.078495365 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:44:03 np0005592767 nova_compute[182623]: 2026-01-22 22:44:03.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:07 np0005592767 nova_compute[182623]: 2026-01-22 22:44:07.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:07 np0005592767 podman[233477]: 2026-01-22 22:44:07.201251446 +0000 UTC m=+0.063816392 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350)
Jan 22 17:44:07 np0005592767 podman[233476]: 2026-01-22 22:44:07.256182936 +0000 UTC m=+0.122018835 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.327 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'name': 'tempest-TestNetworkBasicOps-server-1043176814', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000091', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ffd58948cb444c25ae034a02c0344de7', 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'hostId': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.339 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1f55de0e-e258-4f65-a0e0-f26bebf85ccb / tap4a077200-6d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.340 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1f55de0e-e258-4f65-a0e0-f26bebf85ccb / tap58e15b42-11 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.340 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.bytes volume: 28097 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.341 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.bytes volume: 1330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c028285-42a5-4a32-9d97-60816dcf5d31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28097, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.329150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc41a71c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '7e621056b6648bb7bfb38d1494280ef23643bc3b50a6cdb1cfd39df31aa5448e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1330, 'user_id': 
'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.329150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc41c17a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '44e6a5e4eab31ddcc8fb0082d070b02cb093598413311cf19715571e59d8d980'}]}, 'timestamp': '2026-01-22 22:44:07.342136', '_unique_id': '7748e361aff54843b351cc5a0cec7ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.344 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.346 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.347 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.347 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40ed5778-9117-4e5d-912b-813a588d2f5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.347149', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc429cf8-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': 'd2734be201dac5b6fb8ebe679ff3089a253156ac68e2708b8d3c2489bc2079cb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.347149', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc42b0e4-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': 'e59372b85ffc4185621363d4dd59a6cc4fa4ac8e6db66a11885bd2d472ebeae8'}]}, 'timestamp': '2026-01-22 22:44:07.348330', '_unique_id': 'bf096bd3577e49fba35ed66f13001b69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.349 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.351 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.352 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a14a0ee2-7024-4bb9-9c66-16b05b46aec5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.351320', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc434356-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '0de257b62894d7f17b1ad0214c338504a6050034cf9838f37631a5f8cb1a195d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.351320', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc43588c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '35b114d36e54154f27ece47aac125f1863049be322fa7ee911d2da334b2df629'}]}, 'timestamp': '2026-01-22 22:44:07.352582', '_unique_id': 'e410dca162ef40d797bff59a7928f7cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.353 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.355 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.356 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a20f8d0-8fe7-450b-b700-d0363db39d7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.355508', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc43e504-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '545d6e7aef3193df1142b45f7d0813ed33798bd2e9bba3d3411f3c57917675da'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.355508', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc43fb3e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': 'e633daeafae8a6a8366db5a006a35f41346e6e881d0b8626051f881b827cfa13'}]}, 'timestamp': '2026-01-22 22:44:07.356748', '_unique_id': '3f74ea44a4994551909cf83de7170880'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.359 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.360 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.bytes volume: 23842 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.362 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14c90b76-4221-4f11-90e2-60f85bb0cbda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23842, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.359894', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc44aa98-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': 'c5b08580a5607bf1c7bad69d90044b6c01c830818815ab1497cd87db7f70d33a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': 
'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.359894', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc44e3be-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': 'b9a1d8546a03f7bcde7f01d9ed37aad7ce8f410388fb8b1793730f13f95f96b1'}]}, 'timestamp': '2026-01-22 22:44:07.362683', '_unique_id': '53a12213e53046a4acb82104ef382ee2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.363 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.364 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.388 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/memory.usage volume: 43.75390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e80d4b-c09a-450c-b8c8-42d8d6ba8178', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.75390625, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'timestamp': '2026-01-22T22:44:07.364761', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'dc48e176-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.022946049, 'message_signature': '69f3eae810e36afcb48b8c90827a591f70e6d227c772902841d1759ccb5e8bf6'}]}, 'timestamp': '2026-01-22 22:44:07.388840', '_unique_id': '988a7da296f34d4594243627c7690ccd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.389 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.390 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.391 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.391 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>]
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.391 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.408 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.408 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efc23db9-82c6-44c5-9d60-e18c57a298a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.391752', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc4bed08-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.026642193, 'message_signature': 'af8fa4ebb5e367f68bed0990d1f2a1d1ed84f9af9ed74bb57059d76780084c4b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.391752', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc4c01d0-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.026642193, 'message_signature': '1070ec52f25d4e5b7e4d6c652e74ac4ae6a98d53154ac470f02e47f29384b7f8'}]}, 'timestamp': '2026-01-22 22:44:07.409371', '_unique_id': 'b0331609aeb74c9d85ece37a63a974b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.412 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.412 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>]
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.413 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.413 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe08ce37-17b1-49f8-98d3-647fe5087413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.413109', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc4caa54-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '96943235e2382bc02cbfcb5c2c535898919dd743e853290cbcdadaa30570a2bf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.413109', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc4cbd82-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '743e0f7cb66eea6c194b2f1f00f799a05dc754add7450eac5613d3f0b1958331'}]}, 'timestamp': '2026-01-22 22:44:07.414132', '_unique_id': 'a2728a2b7c9c40e1a8f8b5d12601537c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.416 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.453 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.454 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47a837ef-5961-4d59-a0f3-bcd08828c37f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.416729', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc52d55a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': 'eb435c1c3cea5cb9ec4d88e2b851ad91183801e1ff9943c0f0678995de381ca6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': 
None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.416729', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc52ef5e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '8f3a9698ce8735b17bb4760482c89bca598b1a135f45c40df11113e1ee4e09fa'}]}, 'timestamp': '2026-01-22 22:44:07.454753', '_unique_id': 'a21e646368a542cfb068c55ff55a2eff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.456 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.457 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.458 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45bfd48b-1ba7-41ba-afc6-081b12384113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.457930', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc5387ca-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.026642193, 'message_signature': 'b77f1e9c351507cb94383668c0b9dc92ad8719ec881978acf62a68289528c5cf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 
'1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.457930', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc53a23c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.026642193, 'message_signature': '001b76c5cd9fea9a94c6fda4d62b423d4d8173164ac0c4426985de4ca28de749'}]}, 'timestamp': '2026-01-22 22:44:07.459448', '_unique_id': '9d1ba829b7fc404db46bcf43acc87bb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.460 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.462 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.write.latency volume: 9886829245 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.463 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd567ab1b-8956-499a-938f-26f4d4bc4f9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9886829245, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.462237', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc542f7c-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '9a0e4e7ffedf483dff3ac42755718e162cbeb97033ee1d5919074876fba878e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.462237', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc544caa-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': 'bc5e972cf39b7ffefdc9cebbacc1227a1d7318a922a9408eb07dc8a8c3131018'}]}, 'timestamp': '2026-01-22 22:44:07.463712', '_unique_id': 'ecf0c71a086e4f15aa2200dc516a97b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.464 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.466 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.467 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e93d40d-5380-4ad4-bc2e-6b51deb33b22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.466606', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc54d6ac-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '0065a7207c370d573048f426a80350d171c34d3a17ccb977d8b8781085b927c3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.466606', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc54ed18-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '1564f77aff0aadddc3bae02bbfdc96b5713a8bc3a3cf0d304888742b12850533'}]}, 'timestamp': '2026-01-22 22:44:07.467795', '_unique_id': 'f3f278da2ea74e0789a889e8acbe4005'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.469 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.470 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.470 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.471 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>]
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.471 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.read.bytes volume: 30542336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.472 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '788ce9c2-bf7f-4215-8083-53f5efa97984', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30542336, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.471592', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc5595c4-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': 'd5563c5265f82cfaae6d18860481766ee840be8e79013363d1efc4e66831ba31'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 
'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.471592', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc55ab36-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': 'd1d525bdb3c66bdca23a867071dbeb9e372a85a05654abb882b5704b0cab5cad'}]}, 'timestamp': '2026-01-22 22:44:07.472649', '_unique_id': 'f1c62917e1da4a71be8bc5cfb8b4eee8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.473 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.475 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.476 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea0be1cd-720c-4aba-824c-67b498c72ac7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.475453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc562f02-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '6edeca13e5e9505a980c50b20b319bdf081fb7a405d14954c87d44812074e5e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': 
None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.475453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc56438e-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '60c972802ce1f2d2135c7c9f4ca90885c5f34ce8ab08cafafb0ee414306579fc'}]}, 'timestamp': '2026-01-22 22:44:07.476555', '_unique_id': '436f945eff02451dbb6d265b5ef74183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.477 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.479 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.479 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.479 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce451895-515a-4619-bdd3-89bd5e14cef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.479427', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc56c822-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.026642193, 'message_signature': 'b13701ed2672ca8d3a91e38480747dfeef0fd74930c581c5d48a27071ab607b9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.479427', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc56de48-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.026642193, 'message_signature': 'a2fe39869bc6bb604a1c5de592e1ea0d4fed70923b77689e4baac6181b6ac4bf'}]}, 'timestamp': '2026-01-22 22:44:07.480512', '_unique_id': 'ee6a8a1e95ba4ab7b6d02aafa0e3cdb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.481 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.483 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.483 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.write.bytes volume: 73043968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.483 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e61d063e-3c6b-41aa-bc4c-32660903d24c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73043968, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.483292', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc576138-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '0f612c794f4664499926e254e792ec6a8a3aa40ae409af758f668869c936627e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.483292', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc5773da-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '5912d800aabecaf3269cdfeda33ac270f1e6b090b44976a1727d373600596fec'}]}, 'timestamp': '2026-01-22 22:44:07.484366', '_unique_id': 'f282a752c14e4126b8fef1a7672becf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.485 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.486 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.487 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.packets volume: 149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.487 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e22a2c9b-31ca-4f3f-8da3-5eaa2f5f953e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 149, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.487097', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc57f5ee-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '90314e19192a9a1b211bc9f3510b9ea2b6b141fec75336f9b0a370b7887be4dd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.487097', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc580b92-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '577c8886523be06af662114c8d01f961815336b57c697ba1356ac4a32e1618f3'}]}, 'timestamp': '2026-01-22 22:44:07.488280', '_unique_id': '0514cdcaeb5e49feaffe1614e9bf827c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.489 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.491 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.492 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1093608-86e5-4821-827e-924cc2851050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.491719', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc58a886-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '98a51ca371348250e27dfc930e7908cf2c3933d62fedd7daadbd8275dfb4b97f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.491719', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc58c05a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '96ed7c37a62deeddb42219157cfb6a3258bca456208d34ec5df59de100e03d9b'}]}, 'timestamp': '2026-01-22 22:44:07.492866', '_unique_id': '1a0d887d209b404a9c14a03fb0ee37a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.494 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.495 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.495 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/cpu volume: 11590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '315489d9-ba07-4be8-af1e-f99a6ee4f595', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11590000000, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'timestamp': '2026-01-22T22:44:07.495706', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dc594584-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.022946049, 'message_signature': 'fb543975522406e04cb6bc9d6eb7cb9a7b859dc02590f88c687ec62916df1ed1'}]}, 'timestamp': '2026-01-22 22:44:07.496309', '_unique_id': 'd1cce96ee5d143b7985af541a3caf6a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.497 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.499 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.read.latency volume: 190765821 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.499 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.device.read.latency volume: 22107670 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fa6152b-dce3-4ec4-ada5-1b86113e6691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 190765821, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-vda', 'timestamp': '2026-01-22T22:44:07.498974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc59c63a-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': 'd0cc6707c46705c8b686593ce200058c75431a420c543a869f503125ece3600b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22107670, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': 
None, 'resource_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb-sda', 'timestamp': '2026-01-22T22:44:07.498974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'instance-00000091', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc59d904-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5444.051639019, 'message_signature': '79be4bff34b917178b14d16f921b28d7003628475b9a31f0fe2512c821c192d2'}]}, 'timestamp': '2026-01-22 22:44:07.500066', '_unique_id': 'aa73a85c16d043d68371cc1e9433d4d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.501 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.502 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.503 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.packets volume: 150 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.503 12 DEBUG ceilometer.compute.pollsters [-] 1f55de0e-e258-4f65-a0e0-f26bebf85ccb/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a617cbf1-4b15-449b-bf6d-e7694086416a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 150, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap4a077200-6d', 'timestamp': '2026-01-22T22:44:07.502960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap4a077200-6d', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:48:c1:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a077200-6d'}, 'message_id': 'dc5a60cc-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '3cc1fb47669a02b9b5afa1d64427ed02c4a9fee3a5856ce215555999b98f235f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 
'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_name': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_name': None, 'resource_id': 'instance-00000091-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-tap58e15b42-11', 'timestamp': '2026-01-22T22:44:07.502960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1043176814', 'name': 'tap58e15b42-11', 'instance_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'instance_type': 'm1.nano', 'host': '49e62b6c121637e301161e03cae6d67a46d97868f8aab9d0f4d3f909', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:78:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58e15b42-11'}, 'message_id': 'dc5a6fb8-f7e3-11f0-a43a-fa163ed01feb', 'monotonic_time': 5443.964103699, 'message_signature': '21fcac51551d1e8181b02fb7f591306ba5879559e7e3cee5cace93b30646bb31'}]}, 'timestamp': '2026-01-22 22:44:07.503803', '_unique_id': '5f72a519d141402daed90fee29703de0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.504 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.505 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:44:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:44:07.505 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1043176814>]
Jan 22 17:44:08 np0005592767 nova_compute[182623]: 2026-01-22 22:44:08.570 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:11Z|00588|binding|INFO|Releasing lport 1c942ec0-2d4f-4b23-9de5-bffad84574f0 from this chassis (sb_readonly=0)
Jan 22 17:44:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:11Z|00589|binding|INFO|Releasing lport 93ed692b-12b1-4a5e-af78-c346b15d7d6e from this chassis (sb_readonly=0)
Jan 22 17:44:11 np0005592767 nova_compute[182623]: 2026-01-22 22:44:11.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:12 np0005592767 nova_compute[182623]: 2026-01-22 22:44:12.109 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:12.115 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:12.115 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:12.116 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:12 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:12Z|00590|binding|INFO|Releasing lport 1c942ec0-2d4f-4b23-9de5-bffad84574f0 from this chassis (sb_readonly=0)
Jan 22 17:44:12 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:12Z|00591|binding|INFO|Releasing lport 93ed692b-12b1-4a5e-af78-c346b15d7d6e from this chassis (sb_readonly=0)
Jan 22 17:44:12 np0005592767 nova_compute[182623]: 2026-01-22 22:44:12.620 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:13 np0005592767 nova_compute[182623]: 2026-01-22 22:44:13.537 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121838.5357769, 683611ab-5ba3-4de8-9412-1e6a9979bfd1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:13 np0005592767 nova_compute[182623]: 2026-01-22 22:44:13.538 182627 INFO nova.compute.manager [-] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:44:13 np0005592767 nova_compute[182623]: 2026-01-22 22:44:13.562 182627 DEBUG nova.compute.manager [None req-b13a1479-cb26-46ba-b779-72c8f8a3b7f0 - - - - - -] [instance: 683611ab-5ba3-4de8-9412-1e6a9979bfd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:13 np0005592767 nova_compute[182623]: 2026-01-22 22:44:13.573 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:16 np0005592767 podman[233524]: 2026-01-22 22:44:16.13246579 +0000 UTC m=+0.052169613 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:44:16 np0005592767 podman[233523]: 2026-01-22 22:44:16.143380738 +0000 UTC m=+0.063159333 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 17:44:17 np0005592767 nova_compute[182623]: 2026-01-22 22:44:17.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:18 np0005592767 nova_compute[182623]: 2026-01-22 22:44:18.220 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:18 np0005592767 nova_compute[182623]: 2026-01-22 22:44:18.575 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:20 np0005592767 nova_compute[182623]: 2026-01-22 22:44:20.537 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:21 np0005592767 nova_compute[182623]: 2026-01-22 22:44:21.944 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:22 np0005592767 nova_compute[182623]: 2026-01-22 22:44:22.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:23 np0005592767 podman[233564]: 2026-01-22 22:44:23.155166811 +0000 UTC m=+0.062872145 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:44:23 np0005592767 nova_compute[182623]: 2026-01-22 22:44:23.578 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:24 np0005592767 nova_compute[182623]: 2026-01-22 22:44:24.533 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:25 np0005592767 nova_compute[182623]: 2026-01-22 22:44:25.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.114 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.115 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.135 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.240 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.241 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.248 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.249 182627 INFO nova.compute.claims [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.359 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.359 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.380 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.452 182627 DEBUG nova.compute.provider_tree [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.467 182627 DEBUG nova.scheduler.client.report [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.471 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.487 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.487 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.489 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.496 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.496 182627 INFO nova.compute.claims [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.566 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.566 182627 DEBUG nova.network.neutron [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.588 182627 INFO nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.608 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.712 182627 DEBUG nova.compute.provider_tree [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.744 182627 DEBUG nova.scheduler.client.report [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.777 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.779 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.780 182627 INFO nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Creating image(s)
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.781 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.781 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.782 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.810 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.810 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.816 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.851 182627 DEBUG nova.policy [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21487f95977a444e83139b6e5faf83ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c005f10296264b39a882736d172d2b47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.878 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.879 182627 DEBUG nova.network.neutron [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.896 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.900 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.900 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.917 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.941 182627 INFO nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.957 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.958 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.965 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.980 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:26 np0005592767 nova_compute[182623]: 2026-01-22 22:44:26.980 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.013 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.015 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.015 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.081 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.082 182627 DEBUG nova.virt.disk.api [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Checking if we can resize image /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.082 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.108 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.109 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.110 182627 INFO nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Creating image(s)
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.110 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "/var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.111 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.111 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "/var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.125 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.129 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.146 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.147 182627 DEBUG nova.virt.disk.api [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Cannot resize image /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.148 182627 DEBUG nova.objects.instance [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'migration_context' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.184 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.185 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Ensure instance console log exists: /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.185 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.185 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.186 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.188 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.188 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.189 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.203 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.257 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.258 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.293 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.295 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.296 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.350 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.352 182627 DEBUG nova.virt.disk.api [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Checking if we can resize image /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.352 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.408 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.409 182627 DEBUG nova.virt.disk.api [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Cannot resize image /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.410 182627 DEBUG nova.objects.instance [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'migration_context' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.427 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.428 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Ensure instance console log exists: /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.429 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.430 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.430 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.687 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.688 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.688 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.688 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:44:27 np0005592767 nova_compute[182623]: 2026-01-22 22:44:27.848 182627 DEBUG nova.policy [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80fc173d19874dafa5e0cbd18c7ccf24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839eb51e89b14157b8da40ae1b480ef3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:44:28 np0005592767 nova_compute[182623]: 2026-01-22 22:44:28.213 182627 DEBUG nova.network.neutron [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Successfully created port: a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:44:28 np0005592767 nova_compute[182623]: 2026-01-22 22:44:28.581 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:28 np0005592767 nova_compute[182623]: 2026-01-22 22:44:28.586 182627 DEBUG nova.network.neutron [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Successfully created port: 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.284 182627 DEBUG nova.network.neutron [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Successfully updated port: a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.300 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.301 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquired lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.301 182627 DEBUG nova.network.neutron [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.360 182627 DEBUG nova.network.neutron [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Successfully updated port: 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.377 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.377 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.378 182627 DEBUG nova.network.neutron [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.467 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.495 182627 DEBUG nova.compute.manager [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-changed-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.495 182627 DEBUG nova.compute.manager [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Refreshing instance network info cache due to event network-changed-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.495 182627 DEBUG oslo_concurrency.lockutils [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.525 182627 DEBUG nova.compute.manager [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-changed-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.525 182627 DEBUG nova.compute.manager [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Refreshing instance network info cache due to event network-changed-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.526 182627 DEBUG oslo_concurrency.lockutils [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.547 182627 DEBUG nova.network.neutron [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:44:30 np0005592767 nova_compute[182623]: 2026-01-22 22:44:30.582 182627 DEBUG nova.network.neutron [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:44:32 np0005592767 nova_compute[182623]: 2026-01-22 22:44:32.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.586 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.807 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.872 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.873 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.874 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.874 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.875 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.875 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.909 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.909 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.910 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.910 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:44:33 np0005592767 nova_compute[182623]: 2026-01-22 22:44:33.993 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:34 np0005592767 podman[233620]: 2026-01-22 22:44:34.045566513 +0000 UTC m=+0.093815838 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.053 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.054 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.106 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.271 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.272 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5528MB free_disk=73.09135055541992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.273 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.273 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.444 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.445 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance f8123605-8922-47fd-b7ac-fba5cfac36d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.445 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 221039e7-b475-4211-93ed-ba13c9108ed0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.445 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.445 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.594 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.610 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.635 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.635 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.636 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.668 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.668 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.669 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.704 182627 DEBUG nova.network.neutron [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updating instance_info_cache with network_info: [{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.725 182627 DEBUG nova.network.neutron [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Updating instance_info_cache with network_info: [{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.730 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.730 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance network_info: |[{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.731 182627 DEBUG oslo_concurrency.lockutils [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.731 182627 DEBUG nova.network.neutron [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Refreshing network info cache for port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.734 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Start _get_guest_xml network_info=[{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.739 182627 WARNING nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.748 182627 DEBUG nova.virt.libvirt.host [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.750 182627 DEBUG nova.virt.libvirt.host [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.754 182627 DEBUG nova.virt.libvirt.host [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.755 182627 DEBUG nova.virt.libvirt.host [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.756 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.756 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.757 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.757 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.757 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.757 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.757 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.757 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.758 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.758 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.758 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.758 182627 DEBUG nova.virt.hardware [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.762 182627 DEBUG nova.virt.libvirt.vif [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:44:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-307044385',display_name='tempest-TestNetworkAdvancedServerOps-server-307044385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-307044385',id=150,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzPJfrObpNtHBfp/69vKXuFKjly5i5dFID0PcAeqQJDLNKSyZcYfO4zUUcKFCDAJBRfy8EAIOgR6Q47M2V1QqINGBnb52Cjc6aowh8v2aT2SOkhP9/GuA6sCTfRCeMlnA==',key_name='tempest-TestNetworkAdvancedServerOps-55847567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-zmc7yapg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:44:27Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=221039e7-b475-4211-93ed-ba13c9108ed0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.762 182627 DEBUG nova.network.os_vif_util [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.763 182627 DEBUG nova.network.os_vif_util [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.763 182627 DEBUG nova.objects.instance [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.765 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Releasing lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.766 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance network_info: |[{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.769 182627 DEBUG oslo_concurrency.lockutils [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.769 182627 DEBUG nova.network.neutron [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Refreshing network info cache for port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.771 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Start _get_guest_xml network_info=[{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.776 182627 WARNING nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.780 182627 DEBUG nova.virt.libvirt.host [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.781 182627 DEBUG nova.virt.libvirt.host [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.784 182627 DEBUG nova.virt.libvirt.host [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.785 182627 DEBUG nova.virt.libvirt.host [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.786 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.786 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.786 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.786 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.787 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.787 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.787 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.787 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.787 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.788 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.788 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.788 182627 DEBUG nova.virt.hardware [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.791 182627 DEBUG nova.virt.libvirt.vif [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-551865481',display_name='tempest-ServerRescueTestJSON-server-551865481',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-551865481',id=149,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-paf6o9sg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTestJSON-69724
8807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:44:26Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=f8123605-8922-47fd-b7ac-fba5cfac36d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.791 182627 DEBUG nova.network.os_vif_util [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.791 182627 DEBUG nova.network.os_vif_util [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.792 182627 DEBUG nova.objects.instance [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'pci_devices' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.822 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <uuid>221039e7-b475-4211-93ed-ba13c9108ed0</uuid>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <name>instance-00000096</name>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-307044385</nova:name>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:44:34</nova:creationTime>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:user uuid="80fc173d19874dafa5e0cbd18c7ccf24">tempest-TestNetworkAdvancedServerOps-1664122663-project-member</nova:user>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:project uuid="839eb51e89b14157b8da40ae1b480ef3">tempest-TestNetworkAdvancedServerOps-1664122663</nova:project>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:port uuid="3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="serial">221039e7-b475-4211-93ed-ba13c9108ed0</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="uuid">221039e7-b475-4211-93ed-ba13c9108ed0</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.config"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:8c:dd:9c"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <target dev="tap3b01b1bc-ac"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/console.log" append="off"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:44:34 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:44:34 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.824 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Preparing to wait for external event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.824 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.824 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.824 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.825 182627 DEBUG nova.virt.libvirt.vif [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:44:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-307044385',display_name='tempest-TestNetworkAdvancedServerOps-server-307044385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-307044385',id=150,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzPJfrObpNtHBfp/69vKXuFKjly5i5dFID0PcAeqQJDLNKSyZcYfO4zUUcKFCDAJBRfy8EAIOgR6Q47M2V1QqINGBnb52Cjc6aowh8v2aT2SOkhP9/GuA6sCTfRCeMlnA==',key_name='tempest-TestNetworkAdvancedServerOps-55847567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-zmc7yapg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:44:27Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=221039e7-b475-4211-93ed-ba13c9108ed0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.826 182627 DEBUG nova.network.os_vif_util [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.826 182627 DEBUG nova.network.os_vif_util [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.826 182627 DEBUG os_vif [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.827 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.827 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.828 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.832 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <uuid>f8123605-8922-47fd-b7ac-fba5cfac36d4</uuid>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <name>instance-00000095</name>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueTestJSON-server-551865481</nova:name>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:44:34</nova:creationTime>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:user uuid="21487f95977a444e83139b6e5faf83ce">tempest-ServerRescueTestJSON-697248807-project-member</nova:user>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:project uuid="c005f10296264b39a882736d172d2b47">tempest-ServerRescueTestJSON-697248807</nova:project>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        <nova:port uuid="a0cc4fb3-f017-4200-ae1a-59c0f99b60d0">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="serial">f8123605-8922-47fd-b7ac-fba5cfac36d4</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="uuid">f8123605-8922-47fd-b7ac-fba5cfac36d4</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:69:52:45"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <target dev="tapa0cc4fb3-f0"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/console.log" append="off"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:44:34 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:44:34 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:44:34 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:44:34 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.833 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Preparing to wait for external event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.834 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.834 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.834 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.835 182627 DEBUG nova.virt.libvirt.vif [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-551865481',display_name='tempest-ServerRescueTestJSON-server-551865481',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-551865481',id=149,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-paf6o9sg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTest
JSON-697248807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:44:26Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=f8123605-8922-47fd-b7ac-fba5cfac36d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.836 182627 DEBUG nova.network.os_vif_util [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.836 182627 DEBUG nova.network.os_vif_util [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.837 182627 DEBUG os_vif [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.837 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.838 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b01b1bc-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.838 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b01b1bc-ac, col_values=(('external_ids', {'iface-id': '3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:dd:9c', 'vm-uuid': '221039e7-b475-4211-93ed-ba13c9108ed0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:34 np0005592767 NetworkManager[54973]: <info>  [1769121874.8416] manager: (tap3b01b1bc-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.844 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.847 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.849 182627 INFO os_vif [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac')#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.850 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.850 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.851 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.854 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.854 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0cc4fb3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.854 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa0cc4fb3-f0, col_values=(('external_ids', {'iface-id': 'a0cc4fb3-f017-4200-ae1a-59c0f99b60d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:52:45', 'vm-uuid': 'f8123605-8922-47fd-b7ac-fba5cfac36d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:34 np0005592767 NetworkManager[54973]: <info>  [1769121874.8572] manager: (tapa0cc4fb3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.857 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.867 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.869 182627 INFO os_vif [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0')#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.911 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.911 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.911 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] No VIF found with MAC fa:16:3e:8c:dd:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.912 182627 INFO nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Using config drive#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.923 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.923 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.923 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No VIF found with MAC fa:16:3e:69:52:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:44:34 np0005592767 nova_compute[182623]: 2026-01-22 22:44:34.924 182627 INFO nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Using config drive#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.550 182627 INFO nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Creating config drive at /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.config#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.562 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0v9jievz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.594 182627 INFO nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Creating config drive at /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.605 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3vh1n7a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.636 182627 DEBUG nova.compute.manager [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-changed-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.637 182627 DEBUG nova.compute.manager [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing instance network info cache due to event network-changed-58e15b42-1139-4a64-ba76-2af3eca46aa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.638 182627 DEBUG oslo_concurrency.lockutils [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.638 182627 DEBUG oslo_concurrency.lockutils [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.639 182627 DEBUG nova.network.neutron [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing network info cache for port 58e15b42-1139-4a64-ba76-2af3eca46aa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.690 182627 DEBUG oslo_concurrency.processutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0v9jievz" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.738 182627 DEBUG oslo_concurrency.processutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3vh1n7a" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:35 np0005592767 kernel: tap3b01b1bc-ac: entered promiscuous mode
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.7588] manager: (tap3b01b1bc-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00592|binding|INFO|Claiming lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for this chassis.
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00593|binding|INFO|3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03: Claiming fa:16:3e:8c:dd:9c 10.100.0.14
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.762 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.781 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:dd:9c 10.100.0.14'], port_security=['fa:16:3e:8c:dd:9c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '221039e7-b475-4211-93ed-ba13c9108ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58149591-08d1-41df-aff9-e407627baa5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4c5c45ae-40a8-4bb8-a1ee-71fd2a465240', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cad567a3-7471-409d-9c78-062230502d26, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.782 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 in datapath 58149591-08d1-41df-aff9-e407627baa5e bound to our chassis#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.784 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58149591-08d1-41df-aff9-e407627baa5e#033[00m
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00594|binding|INFO|Setting lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 ovn-installed in OVS
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00595|binding|INFO|Setting lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 up in Southbound
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.786 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:35 np0005592767 systemd-udevd[233677]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.793 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.797 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd771ac-fd65-4775-ad0a-162808906126]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.798 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58149591-01 in ovnmeta-58149591-08d1-41df-aff9-e407627baa5e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.8003] device (tap3b01b1bc-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.8013] device (tap3b01b1bc-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.802 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58149591-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.802 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2309402d-ac1c-47f0-bc39-c20d8f422734]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.803 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[098ab46a-c471-46f6-8575-bbba2ef0b6fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 kernel: tapa0cc4fb3-f0: entered promiscuous mode
Jan 22 17:44:35 np0005592767 systemd-udevd[233685]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.8104] manager: (tapa0cc4fb3-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00596|binding|INFO|Claiming lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for this chassis.
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00597|binding|INFO|a0cc4fb3-f017-4200-ae1a-59c0f99b60d0: Claiming fa:16:3e:69:52:45 10.100.0.5
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.811 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.816 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b1e6fb-f2fd-4013-bac6-2318d50ec7b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 systemd-machined[153912]: New machine qemu-76-instance-00000096.
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.8229] device (tapa0cc4fb3-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.8240] device (tapa0cc4fb3-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.824 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.826 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00598|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 ovn-installed in OVS
Jan 22 17:44:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:35Z|00599|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 up in Southbound
Jan 22 17:44:35 np0005592767 nova_compute[182623]: 2026-01-22 22:44:35.831 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.842 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4caf6a-dc8c-47d3-997f-278dc8e7ad3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 systemd[1]: Started Virtual Machine qemu-76-instance-00000096.
Jan 22 17:44:35 np0005592767 systemd-machined[153912]: New machine qemu-77-instance-00000095.
Jan 22 17:44:35 np0005592767 systemd[1]: Started Virtual Machine qemu-77-instance-00000095.
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.879 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[671fd8aa-b94f-4b33-9a6c-2999195d1ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.8863] manager: (tap58149591-00): new Veth device (/org/freedesktop/NetworkManager/Devices/287)
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.885 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ece1d195-00ba-4c7c-91b5-c931191d1d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.917 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[87c8d7c4-98b3-466d-82a4-1fe906c0b52c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.920 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd2ca61-5056-4253-a75b-c90a35be77c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 NetworkManager[54973]: <info>  [1769121875.9433] device (tap58149591-00): carrier: link connected
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.948 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1eed51-cfeb-4e6c-bf1c-a3eb12613a25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.963 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[21e4e84c-4bbe-4e1b-8362-e7f9cc121fe0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58149591-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fa:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547252, 'reachable_time': 38160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233728, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:35.980 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[60a99caf-5ee0-417b-a3a7-8c84e4cf99e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:fad9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547252, 'tstamp': 547252}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233729, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.004 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[68924141-0a73-4670-9262-5469a5928b47]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58149591-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fa:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547252, 'reachable_time': 38160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233730, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.038 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ad178733-c6de-4d27-8b9f-187fba50f534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.120 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[60ab1b4d-37bf-4beb-b1ff-1ed2a24fb2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.122 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58149591-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.123 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.123 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58149591-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.125 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:36 np0005592767 NetworkManager[54973]: <info>  [1769121876.1270] manager: (tap58149591-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 22 17:44:36 np0005592767 kernel: tap58149591-00: entered promiscuous mode
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.138 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58149591-00, col_values=(('external_ids', {'iface-id': '23833537-3aba-41e1-a81b-33ed3a3af74c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:36Z|00600|binding|INFO|Releasing lport 23833537-3aba-41e1-a81b-33ed3a3af74c from this chassis (sb_readonly=0)
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.142 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58149591-08d1-41df-aff9-e407627baa5e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58149591-08d1-41df-aff9-e407627baa5e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.153 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5ed90d-1345-4bc4-aecf-923c7309b0d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.156 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-58149591-08d1-41df-aff9-e407627baa5e
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/58149591-08d1-41df-aff9-e407627baa5e.pid.haproxy
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 58149591-08d1-41df-aff9-e407627baa5e
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.157 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'env', 'PROCESS_TAG=haproxy-58149591-08d1-41df-aff9-e407627baa5e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58149591-08d1-41df-aff9-e407627baa5e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.272 182627 DEBUG nova.compute.manager [req-fe5c20bd-2268-4457-ae32-e37ea6b5aeb2 req-1ac6c020-f1ea-45b9-9f65-8c2f4a499579 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.273 182627 DEBUG oslo_concurrency.lockutils [req-fe5c20bd-2268-4457-ae32-e37ea6b5aeb2 req-1ac6c020-f1ea-45b9-9f65-8c2f4a499579 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.273 182627 DEBUG oslo_concurrency.lockutils [req-fe5c20bd-2268-4457-ae32-e37ea6b5aeb2 req-1ac6c020-f1ea-45b9-9f65-8c2f4a499579 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.273 182627 DEBUG oslo_concurrency.lockutils [req-fe5c20bd-2268-4457-ae32-e37ea6b5aeb2 req-1ac6c020-f1ea-45b9-9f65-8c2f4a499579 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.274 182627 DEBUG nova.compute.manager [req-fe5c20bd-2268-4457-ae32-e37ea6b5aeb2 req-1ac6c020-f1ea-45b9-9f65-8c2f4a499579 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Processing event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.522 182627 DEBUG nova.network.neutron [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Updated VIF entry in instance network info cache for port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.522 182627 DEBUG nova.network.neutron [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Updating instance_info_cache with network_info: [{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.543 182627 DEBUG oslo_concurrency.lockutils [req-907d35f6-b879-41ed-9043-1aa44843de29 req-d0a7718e-0ba1-4836-8b99-e9d1a53321a4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:36 np0005592767 podman[233764]: 2026-01-22 22:44:36.540584782 +0000 UTC m=+0.061536107 container create 098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:44:36 np0005592767 systemd[1]: Started libpod-conmon-098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca.scope.
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.580 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121876.5804164, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.581 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Started (Lifecycle Event)#033[00m
Jan 22 17:44:36 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:44:36 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ee2bca7dc47593b0c2ad70d3f9ed2f59a0a559a85a3c24b8adb5e57d878b8e7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.600 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:36 np0005592767 podman[233764]: 2026-01-22 22:44:36.509765273 +0000 UTC m=+0.030716578 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.606 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121876.5833344, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.606 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:44:36 np0005592767 podman[233764]: 2026-01-22 22:44:36.609031063 +0000 UTC m=+0.129982368 container init 098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 17:44:36 np0005592767 podman[233764]: 2026-01-22 22:44:36.618912132 +0000 UTC m=+0.139863407 container start 098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.632 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.635 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.639 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.641 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.646 182627 INFO nova.virt.libvirt.driver [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance spawned successfully.#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.646 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:44:36 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [NOTICE]   (233795) : New worker (233797) forked
Jan 22 17:44:36 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [NOTICE]   (233795) : Loading success.
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.658 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.659 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121876.6347964, 221039e7-b475-4211-93ed-ba13c9108ed0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.659 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Started (Lifecycle Event)#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.670 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.670 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.670 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.671 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.671 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.672 182627 DEBUG nova.virt.libvirt.driver [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.679 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.683 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.685 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.687 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:44:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:36.688 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[77a04cb8-0cc5-4b47-ac5f-3cacf47b3b68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.709 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.709 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121876.6350725, 221039e7-b475-4211-93ed-ba13c9108ed0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.709 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.743 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.746 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121876.6373107, 221039e7-b475-4211-93ed-ba13c9108ed0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.747 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.774 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.780 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.801 182627 INFO nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Took 9.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.802 182627 DEBUG nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.813 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.896 182627 INFO nova.compute.manager [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Took 10.46 seconds to build instance.#033[00m
Jan 22 17:44:36 np0005592767 nova_compute[182623]: 2026-01-22 22:44:36.922 182627 DEBUG oslo_concurrency.lockutils [None req-fc73793e-410f-4941-851f-8eb96b33e654 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.033 182627 DEBUG nova.network.neutron [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updated VIF entry in instance network info cache for port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.034 182627 DEBUG nova.network.neutron [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updating instance_info_cache with network_info: [{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.050 182627 DEBUG oslo_concurrency.lockutils [req-531b17ed-c49b-4f9f-83b4-38ff8ee96bed req-9f519d81-1ae9-48eb-8853-8044f017f671 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.122 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.315 182627 DEBUG nova.network.neutron [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated VIF entry in instance network info cache for port 58e15b42-1139-4a64-ba76-2af3eca46aa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.316 182627 DEBUG nova.network.neutron [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.597 182627 DEBUG oslo_concurrency.lockutils [req-9e910867-9ada-4297-a35d-fa612826a756 req-4faa0613-39e5-4f26-a539-6bda08d83d6a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.702 182627 DEBUG nova.compute.manager [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.702 182627 DEBUG oslo_concurrency.lockutils [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.703 182627 DEBUG oslo_concurrency.lockutils [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.703 182627 DEBUG oslo_concurrency.lockutils [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.703 182627 DEBUG nova.compute.manager [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Processing event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.704 182627 DEBUG nova.compute.manager [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.704 182627 DEBUG oslo_concurrency.lockutils [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.704 182627 DEBUG oslo_concurrency.lockutils [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.705 182627 DEBUG oslo_concurrency.lockutils [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.705 182627 DEBUG nova.compute.manager [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.705 182627 WARNING nova.compute.manager [req-749a4a1e-aa69-4bb5-a296-12ec79f03600 req-a40456ac-c98f-4a1c-87cc-ad31636c0b7c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.706 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.711 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121877.7110445, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.711 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.715 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.719 182627 INFO nova.virt.libvirt.driver [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance spawned successfully.#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.719 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.761 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.766 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.780 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.781 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.782 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.782 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.783 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.784 182627 DEBUG nova.virt.libvirt.driver [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.813 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.873 182627 INFO nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Took 11.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.874 182627 DEBUG nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:37 np0005592767 nova_compute[182623]: 2026-01-22 22:44:37.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.017 182627 INFO nova.compute.manager [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Took 11.82 seconds to build instance.#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.047 182627 DEBUG oslo_concurrency.lockutils [None req-2080bdc5-30cb-42f2-967c-2a830ef9edbd 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:38 np0005592767 podman[233810]: 2026-01-22 22:44:38.180726261 +0000 UTC m=+0.090516005 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter)
Jan 22 17:44:38 np0005592767 podman[233809]: 2026-01-22 22:44:38.185771933 +0000 UTC m=+0.097362278 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.362 182627 DEBUG nova.compute.manager [req-2f36c63e-05cc-4180-b1a2-91695b418367 req-b7761247-60de-48cd-9e3b-e1bebfa3a252 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.363 182627 DEBUG oslo_concurrency.lockutils [req-2f36c63e-05cc-4180-b1a2-91695b418367 req-b7761247-60de-48cd-9e3b-e1bebfa3a252 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.363 182627 DEBUG oslo_concurrency.lockutils [req-2f36c63e-05cc-4180-b1a2-91695b418367 req-b7761247-60de-48cd-9e3b-e1bebfa3a252 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.363 182627 DEBUG oslo_concurrency.lockutils [req-2f36c63e-05cc-4180-b1a2-91695b418367 req-b7761247-60de-48cd-9e3b-e1bebfa3a252 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.364 182627 DEBUG nova.compute.manager [req-2f36c63e-05cc-4180-b1a2-91695b418367 req-b7761247-60de-48cd-9e3b-e1bebfa3a252 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:38 np0005592767 nova_compute[182623]: 2026-01-22 22:44:38.364 182627 WARNING nova.compute.manager [req-2f36c63e-05cc-4180-b1a2-91695b418367 req-b7761247-60de-48cd-9e3b-e1bebfa3a252 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received unexpected event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.121 182627 INFO nova.compute.manager [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Rescuing#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.121 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.122 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquired lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.122 182627 DEBUG nova.network.neutron [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.856 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:39 np0005592767 nova_compute[182623]: 2026-01-22 22:44:39.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:44:40 np0005592767 nova_compute[182623]: 2026-01-22 22:44:40.824 182627 DEBUG nova.network.neutron [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Updating instance_info_cache with network_info: [{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:40 np0005592767 nova_compute[182623]: 2026-01-22 22:44:40.844 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Releasing lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:41 np0005592767 nova_compute[182623]: 2026-01-22 22:44:41.168 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 22 17:44:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:42.026 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:42.028 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.095 182627 DEBUG nova.compute.manager [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-changed-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.095 182627 DEBUG nova.compute.manager [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Refreshing instance network info cache due to event network-changed-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.096 182627 DEBUG oslo_concurrency.lockutils [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.096 182627 DEBUG oslo_concurrency.lockutils [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.097 182627 DEBUG nova.network.neutron [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Refreshing network info cache for port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.125 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:44:42 np0005592767 nova_compute[182623]: 2026-01-22 22:44:42.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:44:43 np0005592767 nova_compute[182623]: 2026-01-22 22:44:43.313 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:44:43 np0005592767 nova_compute[182623]: 2026-01-22 22:44:43.857 182627 DEBUG nova.network.neutron [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updated VIF entry in instance network info cache for port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:44:43 np0005592767 nova_compute[182623]: 2026-01-22 22:44:43.857 182627 DEBUG nova.network.neutron [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updating instance_info_cache with network_info: [{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:44:43 np0005592767 nova_compute[182623]: 2026-01-22 22:44:43.873 182627 DEBUG oslo_concurrency.lockutils [req-9cd8fc3a-73fb-47c4-9afa-1fdb427cb6c5 req-c33b7598-2bf3-4d9a-b475-b4ac7d18b618 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:44:44 np0005592767 nova_compute[182623]: 2026-01-22 22:44:44.858 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:47 np0005592767 nova_compute[182623]: 2026-01-22 22:44:47.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:47 np0005592767 podman[233859]: 2026-01-22 22:44:47.230828056 +0000 UTC m=+0.129564246 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:44:47 np0005592767 podman[233860]: 2026-01-22 22:44:47.231374832 +0000 UTC m=+0.133335943 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:44:49 np0005592767 nova_compute[182623]: 2026-01-22 22:44:49.861 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:50Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:dd:9c 10.100.0.14
Jan 22 17:44:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:50Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:dd:9c 10.100.0.14
Jan 22 17:44:51 np0005592767 nova_compute[182623]: 2026-01-22 22:44:51.219 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 22 17:44:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:52.031 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:52 np0005592767 nova_compute[182623]: 2026-01-22 22:44:52.183 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 kernel: tapa0cc4fb3-f0 (unregistering): left promiscuous mode
Jan 22 17:44:53 np0005592767 NetworkManager[54973]: <info>  [1769121893.4159] device (tapa0cc4fb3-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.427 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00601|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=0)
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00602|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 down in Southbound
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00603|binding|INFO|Removing iface tapa0cc4fb3-f0 ovn-installed in OVS
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.432 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.439 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.441 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.444 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.443 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.447 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[308d37a8-b3dc-4f4c-aa32-8a0cb6a46394]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:53 np0005592767 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000095.scope: Deactivated successfully.
Jan 22 17:44:53 np0005592767 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000095.scope: Consumed 12.959s CPU time.
Jan 22 17:44:53 np0005592767 systemd-machined[153912]: Machine qemu-77-instance-00000095 terminated.
Jan 22 17:44:53 np0005592767 podman[233924]: 2026-01-22 22:44:53.515006729 +0000 UTC m=+0.064390191 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:44:53 np0005592767 kernel: tapa0cc4fb3-f0: entered promiscuous mode
Jan 22 17:44:53 np0005592767 kernel: tapa0cc4fb3-f0 (unregistering): left promiscuous mode
Jan 22 17:44:53 np0005592767 NetworkManager[54973]: <info>  [1769121893.6569] manager: (tapa0cc4fb3-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.665 182627 DEBUG nova.compute.manager [req-4fd3e7b0-5c7d-404a-9ab1-cd7fbf69cdd6 req-c54204fe-116e-4455-b6c7-4cf21b0de4a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.665 182627 DEBUG oslo_concurrency.lockutils [req-4fd3e7b0-5c7d-404a-9ab1-cd7fbf69cdd6 req-c54204fe-116e-4455-b6c7-4cf21b0de4a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00604|binding|INFO|Claiming lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for this chassis.
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00605|binding|INFO|a0cc4fb3-f017-4200-ae1a-59c0f99b60d0: Claiming fa:16:3e:69:52:45 10.100.0.5
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.666 182627 DEBUG oslo_concurrency.lockutils [req-4fd3e7b0-5c7d-404a-9ab1-cd7fbf69cdd6 req-c54204fe-116e-4455-b6c7-4cf21b0de4a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.666 182627 DEBUG oslo_concurrency.lockutils [req-4fd3e7b0-5c7d-404a-9ab1-cd7fbf69cdd6 req-c54204fe-116e-4455-b6c7-4cf21b0de4a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.666 182627 DEBUG nova.compute.manager [req-4fd3e7b0-5c7d-404a-9ab1-cd7fbf69cdd6 req-c54204fe-116e-4455-b6c7-4cf21b0de4a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.667 182627 WARNING nova.compute.manager [req-4fd3e7b0-5c7d-404a-9ab1-cd7fbf69cdd6 req-c54204fe-116e-4455-b6c7-4cf21b0de4a8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.668 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.675 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.678 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 bound to our chassis#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.681 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.683 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5f569787-53b9-456b-b885-bb07f4f24753]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.686 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00606|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 ovn-installed in OVS
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.691 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00607|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 up in Southbound
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00608|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=1)
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00609|if_status|INFO|Dropped 2 log messages in last 539 seconds (most recently, 539 seconds ago) due to excessive rate
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00610|if_status|INFO|Not setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 down as sb is readonly
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00611|binding|INFO|Removing iface tapa0cc4fb3-f0 ovn-installed in OVS
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.694 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00612|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=0)
Jan 22 17:44:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:53Z|00613|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 down in Southbound
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.702 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.704 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis#033[00m
Jan 22 17:44:53 np0005592767 nova_compute[182623]: 2026-01-22 22:44:53.705 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.706 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:44:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:53.706 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1e12e87d-00e4-423f-84ca-5f13bbea2961]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.234 182627 INFO nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance shutdown successfully after 13 seconds.#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.240 182627 INFO nova.virt.libvirt.driver [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance destroyed successfully.#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.240 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'numa_topology' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.263 182627 INFO nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Attempting rescue#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.264 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.269 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.269 182627 INFO nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Creating image(s)#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.270 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.270 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.271 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.271 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.332 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.333 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.349 182627 DEBUG oslo_concurrency.processutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.444 182627 DEBUG oslo_concurrency.processutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.445 182627 DEBUG oslo_concurrency.processutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.488 182627 DEBUG oslo_concurrency.processutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.rescue" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.489 182627 DEBUG oslo_concurrency.lockutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.490 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'migration_context' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.503 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.504 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Start _get_guest_xml network_info=[{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1371169293-network", "vif_mac": "fa:16:3e:69:52:45"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.505 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'resources' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.526 182627 WARNING nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.534 182627 DEBUG nova.virt.libvirt.host [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.534 182627 DEBUG nova.virt.libvirt.host [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.539 182627 DEBUG nova.virt.libvirt.host [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.539 182627 DEBUG nova.virt.libvirt.host [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.541 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.541 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.541 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.542 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.543 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.543 182627 DEBUG nova.virt.hardware [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.543 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.561 182627 DEBUG nova.virt.libvirt.vif [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-551865481',display_name='tempest-ServerRescueTestJSON-server-551865481',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-551865481',id=149,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:44:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-paf6o9sg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTestJSON-697248807-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:44:37Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=f8123605-8922-47fd-b7ac-fba5cfac36d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1371169293-network", "vif_mac": "fa:16:3e:69:52:45"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.562 182627 DEBUG nova.network.os_vif_util [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1371169293-network", "vif_mac": "fa:16:3e:69:52:45"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.562 182627 DEBUG nova.network.os_vif_util [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.563 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'pci_devices' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.585 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <uuid>f8123605-8922-47fd-b7ac-fba5cfac36d4</uuid>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <name>instance-00000095</name>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServerRescueTestJSON-server-551865481</nova:name>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:44:54</nova:creationTime>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:user uuid="21487f95977a444e83139b6e5faf83ce">tempest-ServerRescueTestJSON-697248807-project-member</nova:user>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:project uuid="c005f10296264b39a882736d172d2b47">tempest-ServerRescueTestJSON-697248807</nova:project>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        <nova:port uuid="a0cc4fb3-f017-4200-ae1a-59c0f99b60d0">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <entry name="serial">f8123605-8922-47fd-b7ac-fba5cfac36d4</entry>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <entry name="uuid">f8123605-8922-47fd-b7ac-fba5cfac36d4</entry>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.rescue"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <target dev="vdb" bus="virtio"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config.rescue"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:69:52:45"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <target dev="tapa0cc4fb3-f0"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/console.log" append="off"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:44:54 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:44:54 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:44:54 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:44:54 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.592 182627 INFO nova.virt.libvirt.driver [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance destroyed successfully.#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.645 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.646 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.646 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.646 182627 DEBUG nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] No VIF found with MAC fa:16:3e:69:52:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.647 182627 INFO nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Using config drive#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.663 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.688 182627 DEBUG nova.objects.instance [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'keypairs' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:54 np0005592767 nova_compute[182623]: 2026-01-22 22:44:54.863 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:55 np0005592767 nova_compute[182623]: 2026-01-22 22:44:55.905 182627 INFO nova.compute.manager [None req-a3835e40-98dd-4c3b-9744-ccffc2db9df4 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Get console output#033[00m
Jan 22 17:44:55 np0005592767 nova_compute[182623]: 2026-01-22 22:44:55.910 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.204 182627 INFO nova.virt.libvirt.driver [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Creating config drive at /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config.rescue#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.209 182627 DEBUG oslo_concurrency.processutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ltxu2uw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.249 182627 DEBUG nova.objects.instance [None req-cfd6bd50-6846-43e8-9cee-f84a9cb183a6 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.272 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121896.272346, 221039e7-b475-4211-93ed-ba13c9108ed0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.272 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.294 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.299 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.314 182627 DEBUG nova.compute.manager [req-c8af8748-3ec9-41fd-a28a-3a2c9ec6aa5c req-6479a6c2-6b85-4d10-98ca-e56a123839c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.315 182627 DEBUG oslo_concurrency.lockutils [req-c8af8748-3ec9-41fd-a28a-3a2c9ec6aa5c req-6479a6c2-6b85-4d10-98ca-e56a123839c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.315 182627 DEBUG oslo_concurrency.lockutils [req-c8af8748-3ec9-41fd-a28a-3a2c9ec6aa5c req-6479a6c2-6b85-4d10-98ca-e56a123839c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.315 182627 DEBUG oslo_concurrency.lockutils [req-c8af8748-3ec9-41fd-a28a-3a2c9ec6aa5c req-6479a6c2-6b85-4d10-98ca-e56a123839c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.315 182627 DEBUG nova.compute.manager [req-c8af8748-3ec9-41fd-a28a-3a2c9ec6aa5c req-6479a6c2-6b85-4d10-98ca-e56a123839c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.315 182627 WARNING nova.compute.manager [req-c8af8748-3ec9-41fd-a28a-3a2c9ec6aa5c req-6479a6c2-6b85-4d10-98ca-e56a123839c4 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state rescuing.#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.332 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.340 182627 DEBUG oslo_concurrency.processutils [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ltxu2uw" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:44:56 np0005592767 kernel: tapa0cc4fb3-f0: entered promiscuous mode
Jan 22 17:44:56 np0005592767 systemd-udevd[233936]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:44:56 np0005592767 NetworkManager[54973]: <info>  [1769121896.4137] manager: (tapa0cc4fb3-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/290)
Jan 22 17:44:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:56Z|00614|binding|INFO|Claiming lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for this chassis.
Jan 22 17:44:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:56Z|00615|binding|INFO|a0cc4fb3-f017-4200-ae1a-59c0f99b60d0: Claiming fa:16:3e:69:52:45 10.100.0.5
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.416 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:56.425 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:56.426 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 bound to our chassis#033[00m
Jan 22 17:44:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:56.428 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:44:56 np0005592767 NetworkManager[54973]: <info>  [1769121896.4289] device (tapa0cc4fb3-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:44:56 np0005592767 NetworkManager[54973]: <info>  [1769121896.4298] device (tapa0cc4fb3-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:44:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:56.429 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b4eb04-e121-4311-a6a2-67fe6a74f9d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.432 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:56Z|00616|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 ovn-installed in OVS
Jan 22 17:44:56 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:56Z|00617|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 up in Southbound
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.436 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:56 np0005592767 systemd-machined[153912]: New machine qemu-78-instance-00000095.
Jan 22 17:44:56 np0005592767 systemd[1]: Started Virtual Machine qemu-78-instance-00000095.
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.914 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for f8123605-8922-47fd-b7ac-fba5cfac36d4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.915 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121896.9138076, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:56 np0005592767 nova_compute[182623]: 2026-01-22 22:44:56.915 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.013 182627 DEBUG nova.compute.manager [None req-5873307b-eb31-41f7-824b-9612039181c6 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.157 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.162 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.186 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.389 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.390 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121896.9150083, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.390 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Started (Lifecycle Event)#033[00m
Jan 22 17:44:57 np0005592767 kernel: tap3b01b1bc-ac (unregistering): left promiscuous mode
Jan 22 17:44:57 np0005592767 NetworkManager[54973]: <info>  [1769121897.5131] device (tap3b01b1bc-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:44:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:57Z|00618|binding|INFO|Releasing lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 from this chassis (sb_readonly=0)
Jan 22 17:44:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:57Z|00619|binding|INFO|Setting lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 down in Southbound
Jan 22 17:44:57 np0005592767 ovn_controller[94769]: 2026-01-22T22:44:57Z|00620|binding|INFO|Removing iface tap3b01b1bc-ac ovn-installed in OVS
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.519 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.524 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.525 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.531 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.538 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:dd:9c 10.100.0.14'], port_security=['fa:16:3e:8c:dd:9c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '221039e7-b475-4211-93ed-ba13c9108ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58149591-08d1-41df-aff9-e407627baa5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c5c45ae-40a8-4bb8-a1ee-71fd2a465240', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cad567a3-7471-409d-9c78-062230502d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.540 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.541 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 in datapath 58149591-08d1-41df-aff9-e407627baa5e unbound from our chassis#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.544 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58149591-08d1-41df-aff9-e407627baa5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.545 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[df14a308-e2b7-49aa-9483-4b6d98109599]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.546 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58149591-08d1-41df-aff9-e407627baa5e namespace which is not needed anymore#033[00m
Jan 22 17:44:57 np0005592767 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 22 17:44:57 np0005592767 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000096.scope: Consumed 13.718s CPU time.
Jan 22 17:44:57 np0005592767 systemd-machined[153912]: Machine qemu-76-instance-00000096 terminated.
Jan 22 17:44:57 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [NOTICE]   (233795) : haproxy version is 2.8.14-c23fe91
Jan 22 17:44:57 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [NOTICE]   (233795) : path to executable is /usr/sbin/haproxy
Jan 22 17:44:57 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [WARNING]  (233795) : Exiting Master process...
Jan 22 17:44:57 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [WARNING]  (233795) : Exiting Master process...
Jan 22 17:44:57 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [ALERT]    (233795) : Current worker (233797) exited with code 143 (Terminated)
Jan 22 17:44:57 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[233790]: [WARNING]  (233795) : All workers exited. Exiting... (0)
Jan 22 17:44:57 np0005592767 systemd[1]: libpod-098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca.scope: Deactivated successfully.
Jan 22 17:44:57 np0005592767 podman[234037]: 2026-01-22 22:44:57.700727486 +0000 UTC m=+0.051256441 container died 098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:44:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca-userdata-shm.mount: Deactivated successfully.
Jan 22 17:44:57 np0005592767 systemd[1]: var-lib-containers-storage-overlay-8ee2bca7dc47593b0c2ad70d3f9ed2f59a0a559a85a3c24b8adb5e57d878b8e7-merged.mount: Deactivated successfully.
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.737 182627 DEBUG nova.compute.manager [None req-cfd6bd50-6846-43e8-9cee-f84a9cb183a6 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:44:57 np0005592767 podman[234037]: 2026-01-22 22:44:57.745887913 +0000 UTC m=+0.096416868 container cleanup 098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:44:57 np0005592767 systemd[1]: libpod-conmon-098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca.scope: Deactivated successfully.
Jan 22 17:44:57 np0005592767 podman[234081]: 2026-01-22 22:44:57.817648593 +0000 UTC m=+0.048584795 container remove 098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.824 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4471839b-21a7-406c-9074-86aaa55accc3]: (4, ('Thu Jan 22 10:44:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e (098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca)\n098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca\nThu Jan 22 10:44:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e (098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca)\n098595cbde70ace7d789930deba353469ea805550bee3ce66f6a16d23407f1ca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.826 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0a89a664-1e37-4542-8ffb-5cd1af986d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.827 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58149591-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:44:57 np0005592767 kernel: tap58149591-00: left promiscuous mode
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.828 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:57 np0005592767 nova_compute[182623]: 2026-01-22 22:44:57.845 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.849 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3a708fe7-414f-48e4-88e2-550d4f76ec43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.868 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e67d59-3460-4ac8-b620-8f63fa1304be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.870 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b6b07d-7f2b-4c29-9a4d-2708544c0e78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.886 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[051e2848-499d-4f4a-a8fe-36443ece0003]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547245, 'reachable_time': 26035, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234097, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:57 np0005592767 systemd[1]: run-netns-ovnmeta\x2d58149591\x2d08d1\x2d41df\x2daff9\x2de407627baa5e.mount: Deactivated successfully.
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.891 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58149591-08d1-41df-aff9-e407627baa5e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:44:57 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:44:57.891 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[75777a33-3afb-4c0d-94c8-3ebaea72a58d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.440 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.441 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.441 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.442 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.442 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.443 182627 WARNING nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.443 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.444 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.444 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.445 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.445 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.446 182627 WARNING nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.446 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.446 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.447 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.447 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.448 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.448 182627 WARNING nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.449 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.449 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.450 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.450 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.450 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.451 182627 WARNING nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.451 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.452 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.452 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.453 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.453 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.454 182627 WARNING nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.454 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.454 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.455 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.455 182627 DEBUG oslo_concurrency.lockutils [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.456 182627 DEBUG nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:58 np0005592767 nova_compute[182623]: 2026-01-22 22:44:58.456 182627 WARNING nova.compute.manager [req-8d6a5fa0-7c3a-4953-8cb7-ca69db04801c req-ea0f4cb9-4ec1-4245-8618-00d832791445 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state None.#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.862 182627 DEBUG nova.compute.manager [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-unplugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.863 182627 DEBUG oslo_concurrency.lockutils [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.864 182627 DEBUG oslo_concurrency.lockutils [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.864 182627 DEBUG oslo_concurrency.lockutils [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.864 182627 DEBUG nova.compute.manager [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-unplugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.865 182627 WARNING nova.compute.manager [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received unexpected event network-vif-unplugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with vm_state suspended and task_state None.#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.865 182627 DEBUG nova.compute.manager [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.866 182627 DEBUG oslo_concurrency.lockutils [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.866 182627 DEBUG oslo_concurrency.lockutils [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.866 182627 DEBUG oslo_concurrency.lockutils [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.867 182627 DEBUG nova.compute.manager [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.867 182627 WARNING nova.compute.manager [req-cb3e3ff3-7a1d-4445-8bdb-6f8aa7ca8a77 req-e425ee9e-3f6b-4f7b-8dd9-e91fec0c324b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received unexpected event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with vm_state suspended and task_state None.#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.868 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.922 182627 INFO nova.compute.manager [None req-a867e48f-d165-4b6d-bb0c-bf279e58d279 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Get console output#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.942 182627 INFO nova.compute.manager [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Unrescuing#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.944 182627 DEBUG oslo_concurrency.lockutils [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.944 182627 DEBUG oslo_concurrency.lockutils [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquired lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:44:59 np0005592767 nova_compute[182623]: 2026-01-22 22:44:59.945 182627 DEBUG nova.network.neutron [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:45:00 np0005592767 nova_compute[182623]: 2026-01-22 22:45:00.353 182627 INFO nova.compute.manager [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Resuming#033[00m
Jan 22 17:45:00 np0005592767 nova_compute[182623]: 2026-01-22 22:45:00.353 182627 DEBUG nova.objects.instance [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'flavor' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:00 np0005592767 nova_compute[182623]: 2026-01-22 22:45:00.395 182627 DEBUG oslo_concurrency.lockutils [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:00 np0005592767 nova_compute[182623]: 2026-01-22 22:45:00.396 182627 DEBUG oslo_concurrency.lockutils [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquired lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:00 np0005592767 nova_compute[182623]: 2026-01-22 22:45:00.396 182627 DEBUG nova.network.neutron [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.348 182627 DEBUG oslo_concurrency.lockutils [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "interface-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-58e15b42-1139-4a64-ba76-2af3eca46aa1" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.349 182627 DEBUG oslo_concurrency.lockutils [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-58e15b42-1139-4a64-ba76-2af3eca46aa1" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.362 182627 DEBUG nova.objects.instance [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'flavor' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.391 182627 DEBUG nova.virt.libvirt.vif [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:24Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.391 182627 DEBUG nova.network.os_vif_util [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.392 182627 DEBUG nova.network.os_vif_util [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.398 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:fb:78:25"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap58e15b42-11"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.400 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:fb:78:25"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap58e15b42-11"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.403 182627 DEBUG nova.virt.libvirt.driver [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Attempting to detach device tap58e15b42-11 from instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.403 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:fb:78:25"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <target dev="tap58e15b42-11"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.409 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:fb:78:25"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap58e15b42-11"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.413 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:fb:78:25"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap58e15b42-11"/></interface>not found in domain: <domain type='kvm' id='74'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <name>instance-00000091</name>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <uuid>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</uuid>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:name>tempest-TestNetworkBasicOps-server-1043176814</nova:name>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:43:52</nova:creationTime>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:port uuid="4a077200-6d1a-4174-ba2c-090123ed6b58">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:port uuid="58e15b42-1139-4a64-ba76-2af3eca46aa1">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <memory unit='KiB'>131072</memory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <vcpu placement='static'>1</vcpu>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <resource>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <partition>/machine</partition>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </resource>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <sysinfo type='smbios'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='manufacturer'>RDO</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='serial'>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='uuid'>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='family'>Virtual Machine</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <boot dev='hd'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <smbios mode='sysinfo'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <vmcoreinfo state='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <model fallback='forbid'>Nehalem</model>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <feature policy='require' name='x2apic'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <feature policy='require' name='hypervisor'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <feature policy='require' name='vme'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <clock offset='utc'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <timer name='hpet' present='no'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <on_poweroff>destroy</on_poweroff>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <on_reboot>restart</on_reboot>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <on_crash>destroy</on_crash>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <disk type='file' device='disk'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk' index='2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <backingStore type='file' index='3'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <format type='raw'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <backingStore/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      </backingStore>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='vda' bus='virtio'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='virtio-disk0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <disk type='file' device='cdrom'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.config' index='1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <backingStore/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='sda' bus='sata'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <readonly/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='sata0-0-0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pcie.0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='1' port='0x10'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='2' port='0x11'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='3' port='0x12'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='4' port='0x13'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='5' port='0x14'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='6' port='0x15'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='7' port='0x16'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='8' port='0x17'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.8'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='9' port='0x18'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.9'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='10' port='0x19'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.10'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='11' port='0x1a'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.11'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='12' port='0x1b'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.12'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='13' port='0x1c'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.13'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='14' port='0x1d'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.14'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='15' port='0x1e'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.15'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='16' port='0x1f'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.16'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='17' port='0x20'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.17'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='18' port='0x21'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.18'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='19' port='0x22'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.19'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='20' port='0x23'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.20'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='21' port='0x24'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.21'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='22' port='0x25'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.22'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='23' port='0x26'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.23'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='24' port='0x27'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.24'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='25' port='0x28'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.25'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-pci-bridge'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.26'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='usb'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='sata' index='0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='ide'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <interface type='ethernet'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <mac address='fa:16:3e:48:c1:ef'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='tap4a077200-6d'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model type='virtio'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <mtu size='1442'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='net0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <interface type='ethernet'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <mac address='fa:16:3e:fb:78:25'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='tap58e15b42-11'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model type='virtio'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <mtu size='1442'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='net1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <serial type='pty'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/console.log' append='off'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target type='isa-serial' port='0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <model name='isa-serial'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      </target>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/console.log' append='off'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target type='serial' port='0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <input type='tablet' bus='usb'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='input0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='usb' bus='0' port='1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <input type='mouse' bus='ps2'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='input1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <input type='keyboard' bus='ps2'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='input2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <listen type='address' address='::0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <audio id='1' type='none'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='video0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <watchdog model='itco' action='reset'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='watchdog0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </watchdog>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <memballoon model='virtio'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <stats period='10'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='balloon0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <rng model='virtio'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <backend model='random'>/dev/urandom</backend>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='rng0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <label>system_u:system_r:svirt_t:s0:c234,c989</label>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c234,c989</imagelabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <label>+107:+107</label>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <imagelabel>+107:+107</imagelabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.416 182627 INFO nova.virt.libvirt.driver [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully detached device tap58e15b42-11 from instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb from the persistent domain config.#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.417 182627 DEBUG nova.virt.libvirt.driver [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] (1/8): Attempting to detach device tap58e15b42-11 with device alias net1 from instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.417 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] detach device xml: <interface type="ethernet">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <mac address="fa:16:3e:fb:78:25"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <model type="virtio"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <mtu size="1442"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <target dev="tap58e15b42-11"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </interface>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 22 17:45:01 np0005592767 kernel: tap58e15b42-11 (unregistering): left promiscuous mode
Jan 22 17:45:01 np0005592767 NetworkManager[54973]: <info>  [1769121901.5281] device (tap58e15b42-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.542 182627 DEBUG nova.virt.libvirt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Received event <DeviceRemovedEvent: 1769121901.5408034, 1f55de0e-e258-4f65-a0e0-f26bebf85ccb => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 22 17:45:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:01Z|00621|binding|INFO|Releasing lport 58e15b42-1139-4a64-ba76-2af3eca46aa1 from this chassis (sb_readonly=0)
Jan 22 17:45:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:01Z|00622|binding|INFO|Setting lport 58e15b42-1139-4a64-ba76-2af3eca46aa1 down in Southbound
Jan 22 17:45:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:01Z|00623|binding|INFO|Removing iface tap58e15b42-11 ovn-installed in OVS
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.550 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.553 182627 DEBUG nova.virt.libvirt.driver [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Start waiting for the detach event from libvirt for device tap58e15b42-11 with device alias net1 for instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.554 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:fb:78:25"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap58e15b42-11"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.556 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:78:25 10.100.0.19', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1f24d00-ff46-49f8-bf4a-1cd04781c3bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=58e15b42-1139-4a64-ba76-2af3eca46aa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.562 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:fb:78:25"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap58e15b42-11"/></interface>not found in domain: <domain type='kvm' id='74'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <name>instance-00000091</name>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <uuid>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</uuid>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:name>tempest-TestNetworkBasicOps-server-1043176814</nova:name>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:43:52</nova:creationTime>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:port uuid="4a077200-6d1a-4174-ba2c-090123ed6b58">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:port uuid="58e15b42-1139-4a64-ba76-2af3eca46aa1">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <memory unit='KiB'>131072</memory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <vcpu placement='static'>1</vcpu>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <resource>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <partition>/machine</partition>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </resource>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <sysinfo type='smbios'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='manufacturer'>RDO</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='product'>OpenStack Compute</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='serial'>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='uuid'>1f55de0e-e258-4f65-a0e0-f26bebf85ccb</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <entry name='family'>Virtual Machine</entry>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <boot dev='hd'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <smbios mode='sysinfo'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <vmcoreinfo state='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <cpu mode='custom' match='exact' check='full'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <model fallback='forbid'>Nehalem</model>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <feature policy='require' name='x2apic'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <feature policy='require' name='hypervisor'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <feature policy='require' name='vme'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <clock offset='utc'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <timer name='pit' tickpolicy='delay'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <timer name='hpet' present='no'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <on_poweroff>destroy</on_poweroff>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <on_reboot>restart</on_reboot>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <on_crash>destroy</on_crash>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <disk type='file' device='disk'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk' index='2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <backingStore type='file' index='3'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <format type='raw'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <source file='/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <backingStore/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      </backingStore>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='vda' bus='virtio'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='virtio-disk0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <disk type='file' device='cdrom'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='qemu' type='raw' cache='none'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/disk.config' index='1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <backingStore/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='sda' bus='sata'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <readonly/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='sata0-0-0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='0' model='pcie-root'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pcie.0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='1' port='0x10'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='2' port='0x11'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='3' port='0x12'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='4' port='0x13'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='5' port='0x14'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='6' port='0x15'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='7' port='0x16'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='8' port='0x17'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.8'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='9' port='0x18'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.9'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='10' port='0x19'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.10'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='11' port='0x1a'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.11'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='12' port='0x1b'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.12'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='13' port='0x1c'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.13'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='14' port='0x1d'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.14'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='15' port='0x1e'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.15'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.561 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 58e15b42-1139-4a64-ba76-2af3eca46aa1 in datapath d9c983ad-4a50-4312-a557-2e1872b74fdf unbound from our chassis
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='16' port='0x1f'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.16'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='17' port='0x20'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.17'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='18' port='0x21'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.18'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='19' port='0x22'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.19'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='20' port='0x23'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.20'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='21' port='0x24'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.21'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='22' port='0x25'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.22'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='23' port='0x26'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.23'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='24' port='0x27'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.24'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-root-port'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target chassis='25' port='0x28'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.25'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model name='pcie-pci-bridge'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='pci.26'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='usb'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <controller type='sata' index='0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='ide'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </controller>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <interface type='ethernet'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <mac address='fa:16:3e:48:c1:ef'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target dev='tap4a077200-6d'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model type='virtio'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <driver name='vhost' rx_queue_size='512'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <mtu size='1442'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='net0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <serial type='pty'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/console.log' append='off'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target type='isa-serial' port='0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:        <model name='isa-serial'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      </target>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <console type='pty' tty='/dev/pts/0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <source path='/dev/pts/0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <log file='/var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb/console.log' append='off'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <target type='serial' port='0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='serial0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </console>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <input type='tablet' bus='usb'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='input0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='usb' bus='0' port='1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <input type='mouse' bus='ps2'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='input1'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <input type='keyboard' bus='ps2'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='input2'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </input>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <listen type='address' address='::0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </graphics>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <audio id='1' type='none'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <model type='virtio' heads='1' primary='yes'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='video0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <watchdog model='itco' action='reset'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='watchdog0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </watchdog>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <memballoon model='virtio'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <stats period='10'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='balloon0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <rng model='virtio'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <backend model='random'>/dev/urandom</backend>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <alias name='rng0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <label>system_u:system_r:svirt_t:s0:c234,c989</label>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c234,c989</imagelabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <label>+107:+107</label>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <imagelabel>+107:+107</imagelabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </seclabel>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.562 182627 INFO nova.virt.libvirt.driver [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully detached device tap58e15b42-11 from instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb from the live domain config.
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.564 182627 DEBUG nova.virt.libvirt.vif [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:24Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.565 182627 DEBUG nova.network.os_vif_util [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "address": "fa:16:3e:fb:78:25", "network": {"id": "d9c983ad-4a50-4312-a557-2e1872b74fdf", "bridge": "br-int", "label": "tempest-network-smoke--522017006", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58e15b42-11", "ovs_interfaceid": "58e15b42-1139-4a64-ba76-2af3eca46aa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.566 182627 DEBUG nova.network.os_vif_util [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.567 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9c983ad-4a50-4312-a557-2e1872b74fdf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.567 182627 DEBUG os_vif [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.569 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[296c347a-7eaa-4247-8101-3024366e7b6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.571 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.571 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf namespace which is not needed anymore#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.572 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58e15b42-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.580 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.583 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.587 182627 INFO os_vif [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:78:25,bridge_name='br-int',has_traffic_filtering=True,id=58e15b42-1139-4a64-ba76-2af3eca46aa1,network=Network(d9c983ad-4a50-4312-a557-2e1872b74fdf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58e15b42-11')#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.588 182627 DEBUG nova.virt.libvirt.guest [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:name>tempest-TestNetworkBasicOps-server-1043176814</nova:name>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:creationTime>2026-01-22 22:45:01</nova:creationTime>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:flavor name="m1.nano">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:memory>128</nova:memory>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:disk>1</nova:disk>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:swap>0</nova:swap>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:vcpus>1</nova:vcpus>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:flavor>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:owner>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:owner>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  <nova:ports>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    <nova:port uuid="4a077200-6d1a-4174-ba2c-090123ed6b58">
Jan 22 17:45:01 np0005592767 nova_compute[182623]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:    </nova:port>
Jan 22 17:45:01 np0005592767 nova_compute[182623]:  </nova:ports>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: </nova:instance>
Jan 22 17:45:01 np0005592767 nova_compute[182623]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 22 17:45:01 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [NOTICE]   (233360) : haproxy version is 2.8.14-c23fe91
Jan 22 17:45:01 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [NOTICE]   (233360) : path to executable is /usr/sbin/haproxy
Jan 22 17:45:01 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [WARNING]  (233360) : Exiting Master process...
Jan 22 17:45:01 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [WARNING]  (233360) : Exiting Master process...
Jan 22 17:45:01 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [ALERT]    (233360) : Current worker (233362) exited with code 143 (Terminated)
Jan 22 17:45:01 np0005592767 neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf[233356]: [WARNING]  (233360) : All workers exited. Exiting... (0)
Jan 22 17:45:01 np0005592767 systemd[1]: libpod-c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a.scope: Deactivated successfully.
Jan 22 17:45:01 np0005592767 podman[234125]: 2026-01-22 22:45:01.732977722 +0000 UTC m=+0.050571401 container died c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.756 182627 DEBUG nova.network.neutron [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Updating instance_info_cache with network_info: [{"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:01 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a-userdata-shm.mount: Deactivated successfully.
Jan 22 17:45:01 np0005592767 systemd[1]: var-lib-containers-storage-overlay-9a0e6a59e6d348b369975f87c6ebd3799720c4ee42c25bd852d8be5922a1875a-merged.mount: Deactivated successfully.
Jan 22 17:45:01 np0005592767 podman[234125]: 2026-01-22 22:45:01.779060786 +0000 UTC m=+0.096654445 container cleanup c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 17:45:01 np0005592767 systemd[1]: libpod-conmon-c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a.scope: Deactivated successfully.
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.791 182627 DEBUG oslo_concurrency.lockutils [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Releasing lock "refresh_cache-f8123605-8922-47fd-b7ac-fba5cfac36d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.792 182627 DEBUG nova.objects.instance [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'flavor' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:01 np0005592767 kernel: tapa0cc4fb3-f0 (unregistering): left promiscuous mode
Jan 22 17:45:01 np0005592767 NetworkManager[54973]: <info>  [1769121901.8504] device (tapa0cc4fb3-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.857 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:01Z|00624|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=0)
Jan 22 17:45:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:01Z|00625|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 down in Southbound
Jan 22 17:45:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:01Z|00626|binding|INFO|Removing iface tapa0cc4fb3-f0 ovn-installed in OVS
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.860 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.866 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:01 np0005592767 podman[234155]: 2026-01-22 22:45:01.868937998 +0000 UTC m=+0.052376522 container remove c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.876 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[481390b5-f65d-4f4f-9da8-afa87280a758]: (4, ('Thu Jan 22 10:45:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf (c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a)\nc22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a\nThu Jan 22 10:45:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf (c22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a)\nc22e2f93c66fd63e4f9c2e9414ba077b9c327aa85916245484669fb359c74f1a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.877 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.878 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1490b6bf-adfe-40c3-a223-2607ce9144f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.879 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9c983ad-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.880 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 kernel: tapd9c983ad-40: left promiscuous mode
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.897 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.899 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4633cd8b-9052-4c65-98f7-3542c159c19b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000095.scope: Deactivated successfully.
Jan 22 17:45:01 np0005592767 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000095.scope: Consumed 5.224s CPU time.
Jan 22 17:45:01 np0005592767 systemd-machined[153912]: Machine qemu-78-instance-00000095 terminated.
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.913 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3775ccc8-ec35-4b23-b82c-c6be3377ff25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.915 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5167d900-e581-4547-8d26-07f7ff98abba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.929 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2a2d0106-f91d-45e6-8170-b901c7caaf1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542938, 'reachable_time': 43187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234181, 'error': None, 'target': 'ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.931 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9c983ad-4a50-4312-a557-2e1872b74fdf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.931 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3d00570a-b4f3-4352-b309-06003ee512a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.932 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis#033[00m
Jan 22 17:45:01 np0005592767 systemd[1]: run-netns-ovnmeta\x2dd9c983ad\x2d4a50\x2d4312\x2da557\x2d2e1872b74fdf.mount: Deactivated successfully.
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.933 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:45:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:01.934 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[085510d6-0b1e-46aa-afc5-b44fdb4a5c31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.960 182627 DEBUG nova.compute.manager [req-9badd3a4-e368-4685-b6bd-ca01c6d4a4f2 req-6ec0c0c9-61c6-4357-b871-b7bf258f0ef5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-unplugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.960 182627 DEBUG oslo_concurrency.lockutils [req-9badd3a4-e368-4685-b6bd-ca01c6d4a4f2 req-6ec0c0c9-61c6-4357-b871-b7bf258f0ef5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.960 182627 DEBUG oslo_concurrency.lockutils [req-9badd3a4-e368-4685-b6bd-ca01c6d4a4f2 req-6ec0c0c9-61c6-4357-b871-b7bf258f0ef5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.960 182627 DEBUG oslo_concurrency.lockutils [req-9badd3a4-e368-4685-b6bd-ca01c6d4a4f2 req-6ec0c0c9-61c6-4357-b871-b7bf258f0ef5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.960 182627 DEBUG nova.compute.manager [req-9badd3a4-e368-4685-b6bd-ca01c6d4a4f2 req-6ec0c0c9-61c6-4357-b871-b7bf258f0ef5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-unplugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.961 182627 WARNING nova.compute.manager [req-9badd3a4-e368-4685-b6bd-ca01c6d4a4f2 req-6ec0c0c9-61c6-4357-b871-b7bf258f0ef5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-unplugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.964 182627 DEBUG nova.network.neutron [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updating instance_info_cache with network_info: [{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.983 182627 DEBUG oslo_concurrency.lockutils [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Releasing lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.987 182627 DEBUG nova.virt.libvirt.vif [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:44:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-307044385',display_name='tempest-TestNetworkAdvancedServerOps-server-307044385',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-307044385',id=150,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzPJfrObpNtHBfp/69vKXuFKjly5i5dFID0PcAeqQJDLNKSyZcYfO4zUUcKFCDAJBRfy8EAIOgR6Q47M2V1QqINGBnb52Cjc6aowh8v2aT2SOkhP9/GuA6sCTfRCeMlnA==',key_name='tempest-TestNetworkAdvancedServerOps-55847567',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:44:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-zmc7yapg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:44:57Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=221039e7-b475-4211-93ed-ba13c9108ed0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.988 182627 DEBUG nova.network.os_vif_util [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.988 182627 DEBUG nova.network.os_vif_util [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.989 182627 DEBUG os_vif [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.989 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.989 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.990 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.992 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.993 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b01b1bc-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.993 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b01b1bc-ac, col_values=(('external_ids', {'iface-id': '3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:dd:9c', 'vm-uuid': '221039e7-b475-4211-93ed-ba13c9108ed0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.993 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:01 np0005592767 nova_compute[182623]: 2026-01-22 22:45:01.994 182627 INFO os_vif [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac')#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.013 182627 DEBUG nova.objects.instance [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:02 np0005592767 kernel: tapa0cc4fb3-f0: entered promiscuous mode
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.0549] manager: (tapa0cc4fb3-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/291)
Jan 22 17:45:02 np0005592767 systemd-udevd[234106]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:02 np0005592767 kernel: tapa0cc4fb3-f0 (unregistering): left promiscuous mode
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00627|binding|INFO|Claiming lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for this chassis.
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.062 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00628|binding|INFO|a0cc4fb3-f017-4200-ae1a-59c0f99b60d0: Claiming fa:16:3e:69:52:45 10.100.0.5
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.069 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.071 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 bound to our chassis#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.072 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.073 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b5451c-9b4f-4413-98a6-8c224e7aa67a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.089 182627 DEBUG nova.compute.manager [req-5578e8fc-5ee2-40a0-ab57-fe1fcf5ddd77 req-158179d1-0cd4-46d2-8a12-1b9960d0a4c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.090 182627 DEBUG oslo_concurrency.lockutils [req-5578e8fc-5ee2-40a0-ab57-fe1fcf5ddd77 req-158179d1-0cd4-46d2-8a12-1b9960d0a4c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.091 182627 DEBUG oslo_concurrency.lockutils [req-5578e8fc-5ee2-40a0-ab57-fe1fcf5ddd77 req-158179d1-0cd4-46d2-8a12-1b9960d0a4c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.091 182627 DEBUG oslo_concurrency.lockutils [req-5578e8fc-5ee2-40a0-ab57-fe1fcf5ddd77 req-158179d1-0cd4-46d2-8a12-1b9960d0a4c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.091 182627 DEBUG nova.compute.manager [req-5578e8fc-5ee2-40a0-ab57-fe1fcf5ddd77 req-158179d1-0cd4-46d2-8a12-1b9960d0a4c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.092 182627 WARNING nova.compute.manager [req-5578e8fc-5ee2-40a0-ab57-fe1fcf5ddd77 req-158179d1-0cd4-46d2-8a12-1b9960d0a4c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.092 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.098 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00629|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 ovn-installed in OVS
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00630|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 up in Southbound
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00631|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=1)
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00632|binding|INFO|Removing iface tapa0cc4fb3-f0 ovn-installed in OVS
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00633|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=0)
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00634|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 down in Southbound
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.109 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.110 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.111 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.112 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.112 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fce65f74-884c-45f9-8fe6-1bc92d8a629e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.132 182627 INFO nova.virt.libvirt.driver [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance destroyed successfully.#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.132 182627 DEBUG nova.objects.instance [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'numa_topology' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.1508] manager: (tap3b01b1bc-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 22 17:45:02 np0005592767 kernel: tap3b01b1bc-ac: entered promiscuous mode
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00635|binding|INFO|Claiming lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for this chassis.
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00636|binding|INFO|3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03: Claiming fa:16:3e:8c:dd:9c 10.100.0.14
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.1644] device (tap3b01b1bc-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.1652] device (tap3b01b1bc-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.168 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:dd:9c 10.100.0.14'], port_security=['fa:16:3e:8c:dd:9c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '221039e7-b475-4211-93ed-ba13c9108ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58149591-08d1-41df-aff9-e407627baa5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4c5c45ae-40a8-4bb8-a1ee-71fd2a465240', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cad567a3-7471-409d-9c78-062230502d26, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.169 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 in datapath 58149591-08d1-41df-aff9-e407627baa5e bound to our chassis#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.171 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58149591-08d1-41df-aff9-e407627baa5e#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.183 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b284c6db-653e-485f-9bab-952d8d28319e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.184 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58149591-01 in ovnmeta-58149591-08d1-41df-aff9-e407627baa5e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.185 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58149591-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.185 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d98b0983-40a5-41ed-98a0-52ab2fb0c7e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.187 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c21058c8-f592-45ef-9559-1626c19b29c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00637|binding|INFO|Setting lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 ovn-installed in OVS
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00638|binding|INFO|Setting lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 up in Southbound
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.193 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.199 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc539fc-7e0b-4f22-b8a7-5d3c604a03df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 systemd-machined[153912]: New machine qemu-79-instance-00000096.
Jan 22 17:45:02 np0005592767 systemd[1]: Started Virtual Machine qemu-79-instance-00000096.
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.225 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6460bb-82ca-45cb-9a31-c35b57e603c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 kernel: tapa0cc4fb3-f0: entered promiscuous mode
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.2426] manager: (tapa0cc4fb3-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.243 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00639|binding|INFO|Claiming lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for this chassis.
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00640|binding|INFO|a0cc4fb3-f017-4200-ae1a-59c0f99b60d0: Claiming fa:16:3e:69:52:45 10.100.0.5
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.2529] device (tapa0cc4fb3-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.2534] device (tapa0cc4fb3-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.255 182627 DEBUG oslo_concurrency.lockutils [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.256 182627 DEBUG oslo_concurrency.lockutils [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.256 182627 DEBUG nova.network.neutron [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.256 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.257 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00641|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 ovn-installed in OVS
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00642|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 up in Southbound
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.259 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.262 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.262 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0a76f2b3-f036-4a88-af10-1d26f21565e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.270 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d072f6fb-e7a4-4fb8-8530-6873cd829b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.2712] manager: (tap58149591-00): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Jan 22 17:45:02 np0005592767 systemd-machined[153912]: New machine qemu-80-instance-00000095.
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.296 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[990d4746-1c2e-48b8-a011-a6c96bd2fef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.299 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2077f6-930f-4e59-af94-0f1bb27e3fef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 systemd[1]: Started Virtual Machine qemu-80-instance-00000095.
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.3179] device (tap58149591-00): carrier: link connected
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.322 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c350f30b-79e3-4526-a9b4-0e74ae56552c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.340 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9367aa45-bb03-4b4a-9543-faa507e21d67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58149591-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fa:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549889, 'reachable_time': 36951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234261, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.355 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61f9791a-6f12-4cf2-8a43-526f448504c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:fad9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549889, 'tstamp': 549889}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234267, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.371 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b0844c6a-f4f5-4561-bdb4-a95526801b38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58149591-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fa:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549889, 'reachable_time': 36951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234268, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.399 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[017f59d0-9393-48c0-bfb5-fb54ac0a1f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.446 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e42865-0d1f-4aee-8745-1ea4b59f21a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.447 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58149591-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.448 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.448 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58149591-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:02 np0005592767 NetworkManager[54973]: <info>  [1769121902.4508] manager: (tap58149591-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 22 17:45:02 np0005592767 kernel: tap58149591-00: entered promiscuous mode
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.450 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.453 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.458 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58149591-00, col_values=(('external_ids', {'iface-id': '23833537-3aba-41e1-a81b-33ed3a3af74c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.459 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:02Z|00643|binding|INFO|Releasing lport 23833537-3aba-41e1-a81b-33ed3a3af74c from this chassis (sb_readonly=0)
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.462 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58149591-08d1-41df-aff9-e407627baa5e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58149591-08d1-41df-aff9-e407627baa5e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.463 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7095a9c6-2f20-45cb-a0be-62b8623e13c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.464 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-58149591-08d1-41df-aff9-e407627baa5e
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/58149591-08d1-41df-aff9-e407627baa5e.pid.haproxy
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 58149591-08d1-41df-aff9-e407627baa5e
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:45:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:02.464 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'env', 'PROCESS_TAG=haproxy-58149591-08d1-41df-aff9-e407627baa5e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58149591-08d1-41df-aff9-e407627baa5e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.472 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.687 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 221039e7-b475-4211-93ed-ba13c9108ed0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.687 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121902.6869984, 221039e7-b475-4211-93ed-ba13c9108ed0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.687 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Started (Lifecycle Event)#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.705 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.718 182627 DEBUG nova.compute.manager [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.719 182627 DEBUG nova.objects.instance [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.721 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.738 182627 INFO nova.virt.libvirt.driver [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance running successfully.#033[00m
Jan 22 17:45:02 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.740 182627 DEBUG nova.virt.libvirt.guest [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.740 182627 DEBUG nova.compute.manager [None req-c75de67d-886f-40d6-a460-0a2729b829ba 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.745 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.746 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121902.6936595, 221039e7-b475-4211-93ed-ba13c9108ed0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.746 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.765 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.768 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.790 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 22 17:45:02 np0005592767 podman[234308]: 2026-01-22 22:45:02.873602406 +0000 UTC m=+0.052044593 container create 10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:45:02 np0005592767 systemd[1]: Started libpod-conmon-10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236.scope.
Jan 22 17:45:02 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.922 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for f8123605-8922-47fd-b7ac-fba5cfac36d4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.922 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121902.9220939, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.923 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:45:02 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bb44caad8abeeba69af016cb4746db5e39bc0b468040497c391d26ecc9885cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.927 182627 DEBUG nova.compute.manager [None req-9eb34ad1-ab19-429c-84db-ece7a398e7ef 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:02 np0005592767 podman[234308]: 2026-01-22 22:45:02.848737373 +0000 UTC m=+0.027179580 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:45:02 np0005592767 podman[234308]: 2026-01-22 22:45:02.943105412 +0000 UTC m=+0.121547649 container init 10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.943 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.946 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:02 np0005592767 podman[234308]: 2026-01-22 22:45:02.948764492 +0000 UTC m=+0.127206679 container start 10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:45:02 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [NOTICE]   (234331) : New worker (234333) forked
Jan 22 17:45:02 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [NOTICE]   (234331) : Loading success.
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.988 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] During sync_power_state the instance has a pending task (unrescuing). Skip.
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.989 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121902.9262054, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:45:02 np0005592767 nova_compute[182623]: 2026-01-22 22:45:02.989 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Started (Lifecycle Event)
Jan 22 17:45:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:03.016 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis
Jan 22 17:45:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:03.017 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:45:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:03.018 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18ce562f-dd62-4e43-be44-dc6af305285c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.019 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.025 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.514 182627 INFO nova.network.neutron [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Port 58e15b42-1139-4a64-ba76-2af3eca46aa1 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.515 182627 DEBUG nova.network.neutron [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.534 182627 DEBUG oslo_concurrency.lockutils [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.571 182627 DEBUG oslo_concurrency.lockutils [None req-ae75b229-c363-4b7d-b1a4-db8748050fd5 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "interface-1f55de0e-e258-4f65-a0e0-f26bebf85ccb-58e15b42-1139-4a64-ba76-2af3eca46aa1" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:03Z|00644|binding|INFO|Releasing lport 23833537-3aba-41e1-a81b-33ed3a3af74c from this chassis (sb_readonly=0)
Jan 22 17:45:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:03Z|00645|binding|INFO|Releasing lport 93ed692b-12b1-4a5e-af78-c346b15d7d6e from this chassis (sb_readonly=0)
Jan 22 17:45:03 np0005592767 nova_compute[182623]: 2026-01-22 22:45:03.735 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.083 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.084 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.084 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.085 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.085 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.085 182627 WARNING nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-plugged-58e15b42-1139-4a64-ba76-2af3eca46aa1 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.086 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-deleted-58e15b42-1139-4a64-ba76-2af3eca46aa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.086 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.087 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.087 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.088 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.088 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.089 182627 WARNING nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received unexpected event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.089 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.089 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.090 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.090 182627 DEBUG oslo_concurrency.lockutils [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.091 182627 DEBUG nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.091 182627 WARNING nova.compute.manager [req-47954b50-a1fc-441d-a518-13a76ac9fa6b req-bbbc3857-62ce-4b36-bf19-39d5a096e94a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received unexpected event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.227 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.228 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.228 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.229 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.229 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.230 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.230 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.230 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.231 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.231 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.231 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.232 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.233 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.233 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.233 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.234 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.234 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.234 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.235 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.235 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.235 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.236 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.236 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.236 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.237 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.237 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.238 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.238 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.238 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.239 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.239 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.239 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.240 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.240 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.241 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.241 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.241 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.242 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.242 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.243 182627 DEBUG oslo_concurrency.lockutils [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.243 182627 DEBUG nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.243 182627 WARNING nova.compute.manager [req-0788e216-1e8c-452d-bb6d-8c6f0d84aa8a req-19833129-dfa8-4f0b-a4c8-adef2a41d032 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.295 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.296 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.297 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.297 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.298 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.318 182627 INFO nova.compute.manager [None req-6ba9a764-eb33-44aa-ab5a-a3c5ff813a16 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Get console output#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.319 182627 INFO nova.compute.manager [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Terminating instance#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.338 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.341 182627 DEBUG nova.compute.manager [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:45:04 np0005592767 kernel: tapa0cc4fb3-f0 (unregistering): left promiscuous mode
Jan 22 17:45:04 np0005592767 NetworkManager[54973]: <info>  [1769121904.3700] device (tapa0cc4fb3-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.430 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:04Z|00646|binding|INFO|Releasing lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 from this chassis (sb_readonly=0)
Jan 22 17:45:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:04Z|00647|binding|INFO|Setting lport a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 down in Southbound
Jan 22 17:45:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:04Z|00648|binding|INFO|Removing iface tapa0cc4fb3-f0 ovn-installed in OVS
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.433 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.437 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:52:45 10.100.0.5'], port_security=['fa:16:3e:69:52:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f8123605-8922-47fd-b7ac-fba5cfac36d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4fdcd9f-134c-4fe1-8a9d-eaab63006166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c005f10296264b39a882736d172d2b47', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'baf61c3a-2e16-474b-ac51-516a3d297119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=328cd9d9-5b75-488d-aa66-d39201d677fd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.438 104135 INFO neutron.agent.ovn.metadata.agent [-] Port a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 in datapath c4fdcd9f-134c-4fe1-8a9d-eaab63006166 unbound from our chassis#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.439 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4fdcd9f-134c-4fe1-8a9d-eaab63006166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.440 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bb72cb91-a98f-4387-a070-312c00649331]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.444 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000095.scope: Deactivated successfully.
Jan 22 17:45:04 np0005592767 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000095.scope: Consumed 2.100s CPU time.
Jan 22 17:45:04 np0005592767 systemd-machined[153912]: Machine qemu-80-instance-00000095 terminated.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.478 182627 DEBUG nova.compute.manager [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-changed-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.478 182627 DEBUG nova.compute.manager [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing instance network info cache due to event network-changed-4a077200-6d1a-4174-ba2c-090123ed6b58. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.478 182627 DEBUG oslo_concurrency.lockutils [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.478 182627 DEBUG oslo_concurrency.lockutils [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.478 182627 DEBUG nova.network.neutron [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Refreshing network info cache for port 4a077200-6d1a-4174-ba2c-090123ed6b58 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:04 np0005592767 podman[234343]: 2026-01-22 22:45:04.497313734 +0000 UTC m=+0.064561557 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.607 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.608 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.608 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.609 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.609 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.625 182627 INFO nova.virt.libvirt.driver [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Instance destroyed successfully.#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.625 182627 DEBUG nova.objects.instance [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lazy-loading 'resources' on Instance uuid f8123605-8922-47fd-b7ac-fba5cfac36d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.628 182627 INFO nova.compute.manager [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Terminating instance#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.640 182627 DEBUG nova.virt.libvirt.vif [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:44:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-551865481',display_name='tempest-ServerRescueTestJSON-server-551865481',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-551865481',id=149,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:44:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c005f10296264b39a882736d172d2b47',ramdisk_id='',reservation_id='r-paf6o9sg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-697248807',owner_user_name='tempest-ServerRescueTestJSON-697248807-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:02Z,user_data=None,user_id='21487f95977a444e83139b6e5faf83ce',uuid=f8123605-8922-47fd-b7ac-fba5cfac36d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.641 182627 DEBUG nova.network.os_vif_util [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converting VIF {"id": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "address": "fa:16:3e:69:52:45", "network": {"id": "c4fdcd9f-134c-4fe1-8a9d-eaab63006166", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1371169293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c005f10296264b39a882736d172d2b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0cc4fb3-f0", "ovs_interfaceid": "a0cc4fb3-f017-4200-ae1a-59c0f99b60d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.642 182627 DEBUG nova.network.os_vif_util [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.642 182627 DEBUG os_vif [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.645 182627 DEBUG nova.compute.manager [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.646 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.646 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0cc4fb3-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.649 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.651 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.653 182627 INFO os_vif [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:52:45,bridge_name='br-int',has_traffic_filtering=True,id=a0cc4fb3-f017-4200-ae1a-59c0f99b60d0,network=Network(c4fdcd9f-134c-4fe1-8a9d-eaab63006166),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0cc4fb3-f0')#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.654 182627 INFO nova.virt.libvirt.driver [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Deleting instance files /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4_del#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.655 182627 INFO nova.virt.libvirt.driver [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Deletion of /var/lib/nova/instances/f8123605-8922-47fd-b7ac-fba5cfac36d4_del complete#033[00m
Jan 22 17:45:04 np0005592767 kernel: tap4a077200-6d (unregistering): left promiscuous mode
Jan 22 17:45:04 np0005592767 NetworkManager[54973]: <info>  [1769121904.6842] device (tap4a077200-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:04Z|00649|binding|INFO|Releasing lport 4a077200-6d1a-4174-ba2c-090123ed6b58 from this chassis (sb_readonly=0)
Jan 22 17:45:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:04Z|00650|binding|INFO|Setting lport 4a077200-6d1a-4174-ba2c-090123ed6b58 down in Southbound
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.691 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:04Z|00651|binding|INFO|Removing iface tap4a077200-6d ovn-installed in OVS
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.702 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:c1:ef 10.100.0.3'], port_security=['fa:16:3e:48:c1:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1f55de0e-e258-4f65-a0e0-f26bebf85ccb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b8224f0-0e08-4065-b940-1530a6a30708', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6c3e3c74-0fd6-4ae4-95ed-b97b1894cf2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c373b96b-a79e-44de-a1da-4f3934614dac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4a077200-6d1a-4174-ba2c-090123ed6b58) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.703 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4a077200-6d1a-4174-ba2c-090123ed6b58 in datapath 9b8224f0-0e08-4065-b940-1530a6a30708 unbound from our chassis#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.704 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b8224f0-0e08-4065-b940-1530a6a30708, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.705 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5aa0cb-00ba-443b-aa66-3c6b637b3a40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:04.706 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708 namespace which is not needed anymore#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.711 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 22 17:45:04 np0005592767 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000091.scope: Consumed 17.223s CPU time.
Jan 22 17:45:04 np0005592767 systemd-machined[153912]: Machine qemu-74-instance-00000091 terminated.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.781 182627 INFO nova.compute.manager [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.782 182627 DEBUG oslo.service.loopingcall [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.782 182627 DEBUG nova.compute.manager [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.783 182627 DEBUG nova.network.neutron [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:45:04 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [NOTICE]   (232952) : haproxy version is 2.8.14-c23fe91
Jan 22 17:45:04 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [NOTICE]   (232952) : path to executable is /usr/sbin/haproxy
Jan 22 17:45:04 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [WARNING]  (232952) : Exiting Master process...
Jan 22 17:45:04 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [WARNING]  (232952) : Exiting Master process...
Jan 22 17:45:04 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [ALERT]    (232952) : Current worker (232954) exited with code 143 (Terminated)
Jan 22 17:45:04 np0005592767 neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708[232948]: [WARNING]  (232952) : All workers exited. Exiting... (0)
Jan 22 17:45:04 np0005592767 systemd[1]: libpod-5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5.scope: Deactivated successfully.
Jan 22 17:45:04 np0005592767 podman[234407]: 2026-01-22 22:45:04.87934566 +0000 UTC m=+0.068696604 container died 5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.917 182627 INFO nova.virt.libvirt.driver [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Instance destroyed successfully.#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.918 182627 DEBUG nova.objects.instance [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid 1f55de0e-e258-4f65-a0e0-f26bebf85ccb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:04 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5-userdata-shm.mount: Deactivated successfully.
Jan 22 17:45:04 np0005592767 systemd[1]: var-lib-containers-storage-overlay-26edfef5876d025259af13b4d0d144dd099a17640c04e7537764de2a0bdf9a15-merged.mount: Deactivated successfully.
Jan 22 17:45:04 np0005592767 podman[234407]: 2026-01-22 22:45:04.930572849 +0000 UTC m=+0.119923753 container cleanup 5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.934 182627 DEBUG nova.virt.libvirt.vif [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1043176814',display_name='tempest-TestNetworkBasicOps-server-1043176814',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1043176814',id=145,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJKv5nY3CjmGfNT6B/gpPzbjE89ugfijP7xjsIi8SwE+Wk4m0lVirbfrck91h4aZAO9evKrblzqraFcCEdv736hYfKg//l5lI5mOPW+VndJ+6BDevIZqRh3pCBaesVtehQ==',key_name='tempest-TestNetworkBasicOps-579539433',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:43:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-m3u3vn79',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:43:24Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=1f55de0e-e258-4f65-a0e0-f26bebf85ccb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.935 182627 DEBUG nova.network.os_vif_util [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.936 182627 DEBUG nova.network.os_vif_util [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.937 182627 DEBUG os_vif [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.938 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.939 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a077200-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.941 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:04 np0005592767 systemd[1]: libpod-conmon-5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5.scope: Deactivated successfully.
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.943 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.945 182627 INFO os_vif [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:c1:ef,bridge_name='br-int',has_traffic_filtering=True,id=4a077200-6d1a-4174-ba2c-090123ed6b58,network=Network(9b8224f0-0e08-4065-b940-1530a6a30708),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a077200-6d')#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.946 182627 INFO nova.virt.libvirt.driver [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Deleting instance files /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb_del#033[00m
Jan 22 17:45:04 np0005592767 nova_compute[182623]: 2026-01-22 22:45:04.947 182627 INFO nova.virt.libvirt.driver [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Deletion of /var/lib/nova/instances/1f55de0e-e258-4f65-a0e0-f26bebf85ccb_del complete#033[00m
Jan 22 17:45:05 np0005592767 podman[234456]: 2026-01-22 22:45:05.017600891 +0000 UTC m=+0.052422874 container remove 5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.017 182627 INFO nova.compute.manager [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.018 182627 DEBUG oslo.service.loopingcall [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.018 182627 DEBUG nova.compute.manager [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.018 182627 DEBUG nova.network.neutron [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.025 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fbdd5f19-462b-456b-bf63-efe56115b94e]: (4, ('Thu Jan 22 10:45:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708 (5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5)\n5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5\nThu Jan 22 10:45:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708 (5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5)\n5948f22d90cb89c4b4c4547f5a1bae6f0f83bbcbcbfe91785e84120fdb2d1ab5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.027 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[688e432c-049b-45f4-a90f-f72a14f58095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.028 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b8224f0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:05 np0005592767 kernel: tap9b8224f0-00: left promiscuous mode
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.050 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[275b3142-44d4-41dc-a9da-6c73eea1398f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.064 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[320cfa0c-24df-457a-8885-6b9ce35ac67d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.065 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[08d9e97e-53dc-4ef4-b9de-494572196f7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.085 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[03925d18-54ea-4c2c-868f-85fa2de46fda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539949, 'reachable_time': 26205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234471, 'error': None, 'target': 'ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 systemd[1]: run-netns-ovnmeta\x2d9b8224f0\x2d0e08\x2d4065\x2db940\x2d1530a6a30708.mount: Deactivated successfully.
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.090 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b8224f0-0e08-4065-b940-1530a6a30708 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.091 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[78e3b5f3-6c12-417f-8fcc-74ebfe2faaad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.640 182627 DEBUG nova.network.neutron [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.654 182627 INFO nova.compute.manager [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Took 0.64 seconds to deallocate network for instance.#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.664 182627 DEBUG nova.network.neutron [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.686 182627 INFO nova.compute.manager [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Took 0.90 seconds to deallocate network for instance.#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.744 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.745 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.745 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.746 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.746 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.750 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.751 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.770 182627 INFO nova.compute.manager [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Terminating instance#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.779 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.789 182627 DEBUG nova.compute.manager [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:45:05 np0005592767 kernel: tap3b01b1bc-ac (unregistering): left promiscuous mode
Jan 22 17:45:05 np0005592767 NetworkManager[54973]: <info>  [1769121905.8146] device (tap3b01b1bc-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:05Z|00652|binding|INFO|Releasing lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 from this chassis (sb_readonly=0)
Jan 22 17:45:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:05Z|00653|binding|INFO|Setting lport 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 down in Southbound
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.822 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:05Z|00654|binding|INFO|Removing iface tap3b01b1bc-ac ovn-installed in OVS
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.834 182627 DEBUG nova.network.neutron [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updated VIF entry in instance network info cache for port 4a077200-6d1a-4174-ba2c-090123ed6b58. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.835 182627 DEBUG nova.network.neutron [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Updating instance_info_cache with network_info: [{"id": "4a077200-6d1a-4174-ba2c-090123ed6b58", "address": "fa:16:3e:48:c1:ef", "network": {"id": "9b8224f0-0e08-4065-b940-1530a6a30708", "bridge": "br-int", "label": "tempest-network-smoke--1851880372", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a077200-6d", "ovs_interfaceid": "4a077200-6d1a-4174-ba2c-090123ed6b58", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.835 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:dd:9c 10.100.0.14'], port_security=['fa:16:3e:8c:dd:9c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '221039e7-b475-4211-93ed-ba13c9108ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58149591-08d1-41df-aff9-e407627baa5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '839eb51e89b14157b8da40ae1b480ef3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4c5c45ae-40a8-4bb8-a1ee-71fd2a465240', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cad567a3-7471-409d-9c78-062230502d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.838 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 in datapath 58149591-08d1-41df-aff9-e407627baa5e unbound from our chassis#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.840 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58149591-08d1-41df-aff9-e407627baa5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.841 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f51f3232-1478-4d33-8899-29c23ea46f7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:05.842 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58149591-08d1-41df-aff9-e407627baa5e namespace which is not needed anymore#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.843 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.857 182627 DEBUG oslo_concurrency.lockutils [req-b6df0350-f7e5-4577-99b2-34d894749a55 req-e271f460-c885-44e2-84f7-ee7d3757135a 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-1f55de0e-e258-4f65-a0e0-f26bebf85ccb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:05 np0005592767 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 22 17:45:05 np0005592767 systemd-machined[153912]: Machine qemu-79-instance-00000096 terminated.
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.941 182627 DEBUG nova.compute.provider_tree [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.956 182627 DEBUG nova.scheduler.client.report [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:05 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [NOTICE]   (234331) : haproxy version is 2.8.14-c23fe91
Jan 22 17:45:05 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [NOTICE]   (234331) : path to executable is /usr/sbin/haproxy
Jan 22 17:45:05 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [WARNING]  (234331) : Exiting Master process...
Jan 22 17:45:05 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [ALERT]    (234331) : Current worker (234333) exited with code 143 (Terminated)
Jan 22 17:45:05 np0005592767 neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e[234327]: [WARNING]  (234331) : All workers exited. Exiting... (0)
Jan 22 17:45:05 np0005592767 systemd[1]: libpod-10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236.scope: Deactivated successfully.
Jan 22 17:45:05 np0005592767 podman[234493]: 2026-01-22 22:45:05.976806273 +0000 UTC m=+0.050150549 container died 10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.980 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:05 np0005592767 nova_compute[182623]: 2026-01-22 22:45:05.983 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 NetworkManager[54973]: <info>  [1769121906.0073] manager: (tap3b01b1bc-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.008 182627 INFO nova.scheduler.client.report [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb#033[00m
Jan 22 17:45:06 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236-userdata-shm.mount: Deactivated successfully.
Jan 22 17:45:06 np0005592767 systemd[1]: var-lib-containers-storage-overlay-7bb44caad8abeeba69af016cb4746db5e39bc0b468040497c391d26ecc9885cb-merged.mount: Deactivated successfully.
Jan 22 17:45:06 np0005592767 podman[234493]: 2026-01-22 22:45:06.016794494 +0000 UTC m=+0.090138770 container cleanup 10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:45:06 np0005592767 systemd[1]: libpod-conmon-10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236.scope: Deactivated successfully.
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.059 182627 DEBUG nova.compute.provider_tree [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.064 182627 INFO nova.virt.libvirt.driver [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Instance destroyed successfully.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.065 182627 DEBUG nova.objects.instance [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lazy-loading 'resources' on Instance uuid 221039e7-b475-4211-93ed-ba13c9108ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:06 np0005592767 podman[234532]: 2026-01-22 22:45:06.084679975 +0000 UTC m=+0.046168237 container remove 10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.084 182627 DEBUG nova.virt.libvirt.vif [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:44:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-307044385',display_name='tempest-TestNetworkAdvancedServerOps-server-307044385',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-307044385',id=150,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOzPJfrObpNtHBfp/69vKXuFKjly5i5dFID0PcAeqQJDLNKSyZcYfO4zUUcKFCDAJBRfy8EAIOgR6Q47M2V1QqINGBnb52Cjc6aowh8v2aT2SOkhP9/GuA6sCTfRCeMlnA==',key_name='tempest-TestNetworkAdvancedServerOps-55847567',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:44:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='839eb51e89b14157b8da40ae1b480ef3',ramdisk_id='',reservation_id='r-zmc7yapg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1664122663',owner_user_name='tempest-TestNetworkAdvancedServerOps-1664122663-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:02Z,user_data=None,user_id='80fc173d19874dafa5e0cbd18c7ccf24',uuid=221039e7-b475-4211-93ed-ba13c9108ed0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.085 182627 DEBUG nova.network.os_vif_util [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converting VIF {"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.086 182627 DEBUG nova.network.os_vif_util [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.086 182627 DEBUG os_vif [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.090 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[240cc706-ef6b-4038-84ce-6cf373caf66c]: (4, ('Thu Jan 22 10:45:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e (10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236)\n10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236\nThu Jan 22 10:45:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58149591-08d1-41df-aff9-e407627baa5e (10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236)\n10d5a2b528cf28fcc71ff61f0428510b2bc0d1e249b8e59a90a4d2c0d2258236\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.094 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3ca7e8-fb30-4f6a-8d5f-b06eeccf139f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.097 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58149591-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.097 182627 DEBUG nova.scheduler.client.report [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:06 np0005592767 kernel: tap58149591-00: left promiscuous mode
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.104 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.105 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b01b1bc-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.117 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.119 182627 DEBUG oslo_concurrency.lockutils [None req-dfb51a30-8e8a-4f2f-abc8-690f31d7f9fc b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.120 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.120 182627 INFO os_vif [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:dd:9c,bridge_name='br-int',has_traffic_filtering=True,id=3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03,network=Network(58149591-08d1-41df-aff9-e407627baa5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b01b1bc-ac')#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.121 182627 INFO nova.virt.libvirt.driver [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Deleting instance files /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0_del#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.122 182627 INFO nova.virt.libvirt.driver [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Deletion of /var/lib/nova/instances/221039e7-b475-4211-93ed-ba13c9108ed0_del complete#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.121 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2259d4-7119-4934-b09f-51b5e92d4e31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.130 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.136 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[625f11fa-1cd1-4606-abc5-caa1b661c1c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.138 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eb258d54-e634-4f1a-9aca-4935fa72d644]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.149 182627 INFO nova.scheduler.client.report [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Deleted allocations for instance f8123605-8922-47fd-b7ac-fba5cfac36d4#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.163 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a499da55-4945-4dde-92ab-745990dd0163]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549883, 'reachable_time': 37909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234554, 'error': None, 'target': 'ovnmeta-58149591-08d1-41df-aff9-e407627baa5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 systemd[1]: run-netns-ovnmeta\x2d58149591\x2d08d1\x2d41df\x2daff9\x2de407627baa5e.mount: Deactivated successfully.
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.167 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58149591-08d1-41df-aff9-e407627baa5e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:45:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:06.168 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[72f1aa28-5610-4001-bbc5-7013cdd6f80c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.180 182627 DEBUG nova.compute.manager [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-unplugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.181 182627 DEBUG oslo_concurrency.lockutils [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.181 182627 DEBUG oslo_concurrency.lockutils [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.182 182627 DEBUG oslo_concurrency.lockutils [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.182 182627 DEBUG nova.compute.manager [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-unplugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.182 182627 DEBUG nova.compute.manager [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-unplugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.182 182627 DEBUG nova.compute.manager [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.183 182627 DEBUG oslo_concurrency.lockutils [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.183 182627 DEBUG oslo_concurrency.lockutils [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.183 182627 DEBUG oslo_concurrency.lockutils [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.183 182627 DEBUG nova.compute.manager [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] No waiting events found dispatching network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.183 182627 WARNING nova.compute.manager [req-af39a1f9-7662-4116-9d4d-b7dea975eac0 req-db94eae1-1478-4377-b33c-4021ad6fe0c5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received unexpected event network-vif-plugged-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.213 182627 INFO nova.compute.manager [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.214 182627 DEBUG oslo.service.loopingcall [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.215 182627 DEBUG nova.compute.manager [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.215 182627 DEBUG nova.network.neutron [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.241 182627 DEBUG oslo_concurrency.lockutils [None req-2158c899-593b-4376-a35e-e8c09efd1775 21487f95977a444e83139b6e5faf83ce c005f10296264b39a882736d172d2b47 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.309 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.309 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.310 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.310 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.311 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.311 182627 WARNING nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-unplugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.311 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.312 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.312 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.313 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "f8123605-8922-47fd-b7ac-fba5cfac36d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.313 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] No waiting events found dispatching network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.313 182627 WARNING nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received unexpected event network-vif-plugged-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.314 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-changed-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.314 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Refreshing instance network info cache due to event network-changed-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.315 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.315 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.316 182627 DEBUG nova.network.neutron [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Refreshing network info cache for port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.589 182627 DEBUG nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-unplugged-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.589 182627 DEBUG oslo_concurrency.lockutils [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.590 182627 DEBUG oslo_concurrency.lockutils [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.590 182627 DEBUG oslo_concurrency.lockutils [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.591 182627 DEBUG nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-unplugged-4a077200-6d1a-4174-ba2c-090123ed6b58 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.591 182627 WARNING nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-unplugged-4a077200-6d1a-4174-ba2c-090123ed6b58 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.592 182627 DEBUG nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.592 182627 DEBUG oslo_concurrency.lockutils [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.593 182627 DEBUG oslo_concurrency.lockutils [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.593 182627 DEBUG oslo_concurrency.lockutils [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "1f55de0e-e258-4f65-a0e0-f26bebf85ccb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.594 182627 DEBUG nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] No waiting events found dispatching network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.594 182627 WARNING nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received unexpected event network-vif-plugged-4a077200-6d1a-4174-ba2c-090123ed6b58 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.594 182627 DEBUG nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Received event network-vif-deleted-4a077200-6d1a-4174-ba2c-090123ed6b58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.595 182627 INFO nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Neutron deleted interface 4a077200-6d1a-4174-ba2c-090123ed6b58; detaching it from the instance and deleting it from the info cache#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.595 182627 DEBUG nova.network.neutron [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 22 17:45:06 np0005592767 nova_compute[182623]: 2026-01-22 22:45:06.598 182627 DEBUG nova.compute.manager [req-b85fe77c-eed8-48ab-97ba-fa00cb6156f8 req-2a4ff1f5-4f4a-4303-971e-6a7cc519e96b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Detach interface failed, port_id=4a077200-6d1a-4174-ba2c-090123ed6b58, reason: Instance 1f55de0e-e258-4f65-a0e0-f26bebf85ccb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.194 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.335 182627 DEBUG nova.network.neutron [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.353 182627 INFO nova.compute.manager [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.428 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.428 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.469 182627 DEBUG nova.compute.provider_tree [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.486 182627 DEBUG nova.scheduler.client.report [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.504 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.525 182627 INFO nova.scheduler.client.report [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Deleted allocations for instance 221039e7-b475-4211-93ed-ba13c9108ed0#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.599 182627 DEBUG oslo_concurrency.lockutils [None req-5255c687-8b90-497d-9839-5c1af1c35f1b 80fc173d19874dafa5e0cbd18c7ccf24 839eb51e89b14157b8da40ae1b480ef3 - - default default] Lock "221039e7-b475-4211-93ed-ba13c9108ed0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.691 182627 DEBUG nova.network.neutron [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updated VIF entry in instance network info cache for port 3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.691 182627 DEBUG nova.network.neutron [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Updating instance_info_cache with network_info: [{"id": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "address": "fa:16:3e:8c:dd:9c", "network": {"id": "58149591-08d1-41df-aff9-e407627baa5e", "bridge": "br-int", "label": "tempest-network-smoke--428899859", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "839eb51e89b14157b8da40ae1b480ef3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b01b1bc-ac", "ovs_interfaceid": "3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.719 182627 DEBUG oslo_concurrency.lockutils [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-221039e7-b475-4211-93ed-ba13c9108ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:07 np0005592767 nova_compute[182623]: 2026-01-22 22:45:07.720 182627 DEBUG nova.compute.manager [req-727305e7-ed0a-4a80-af26-17b94baf7ac6 req-6cf2416b-5903-4efa-bc6d-fba5067e32cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Received event network-vif-deleted-a0cc4fb3-f017-4200-ae1a-59c0f99b60d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:08 np0005592767 nova_compute[182623]: 2026-01-22 22:45:08.274 182627 DEBUG nova.compute.manager [req-504386ad-2321-41b1-bc88-099f5a231d9f req-af03b695-a299-480c-923c-8f16303bf913 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Received event network-vif-deleted-3b01b1bc-ac1d-4f10-9bd1-48e5bee50b03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:09 np0005592767 podman[234556]: 2026-01-22 22:45:09.148040074 +0000 UTC m=+0.059782842 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 17:45:09 np0005592767 podman[234555]: 2026-01-22 22:45:09.157869252 +0000 UTC m=+0.075694252 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:45:10 np0005592767 nova_compute[182623]: 2026-01-22 22:45:10.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:11 np0005592767 nova_compute[182623]: 2026-01-22 22:45:11.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:11 np0005592767 nova_compute[182623]: 2026-01-22 22:45:11.110 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.015 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.016 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.033 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:45:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:12.116 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.116 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.117 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:12.117 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:12.118 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.126 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.126 182627 INFO nova.compute.claims [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.234 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.343 182627 DEBUG nova.compute.provider_tree [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.362 182627 DEBUG nova.scheduler.client.report [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.389 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.390 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.452 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.453 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.468 182627 INFO nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.491 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.602 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.603 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.604 182627 INFO nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Creating image(s)#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.604 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.605 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.606 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.621 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.686 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.689 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.690 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.704 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.792 182627 DEBUG nova.policy [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.798 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.799 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.853 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.854 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.854 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.923 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.924 182627 DEBUG nova.virt.disk.api [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.924 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.984 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.986 182627 DEBUG nova.virt.disk.api [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:45:12 np0005592767 nova_compute[182623]: 2026-01-22 22:45:12.987 182627 DEBUG nova.objects.instance [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid bc03363d-d5c8-41bf-b821-89de00c02b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:14 np0005592767 nova_compute[182623]: 2026-01-22 22:45:14.124 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:45:14 np0005592767 nova_compute[182623]: 2026-01-22 22:45:14.126 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Ensure instance console log exists: /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:45:14 np0005592767 nova_compute[182623]: 2026-01-22 22:45:14.127 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:14 np0005592767 nova_compute[182623]: 2026-01-22 22:45:14.128 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:14 np0005592767 nova_compute[182623]: 2026-01-22 22:45:14.128 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:15 np0005592767 nova_compute[182623]: 2026-01-22 22:45:15.791 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Successfully created port: 0ed65493-a756-4d1d-88c3-edf23728b4e9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:45:16 np0005592767 nova_compute[182623]: 2026-01-22 22:45:16.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:16 np0005592767 nova_compute[182623]: 2026-01-22 22:45:16.479 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Successfully created port: ebd087e9-f858-41e2-a292-74b1525897a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.236 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.550 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Successfully updated port: 0ed65493-a756-4d1d-88c3-edf23728b4e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.705 182627 DEBUG nova.compute.manager [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-changed-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.706 182627 DEBUG nova.compute.manager [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing instance network info cache due to event network-changed-0ed65493-a756-4d1d-88c3-edf23728b4e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.706 182627 DEBUG oslo_concurrency.lockutils [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.707 182627 DEBUG oslo_concurrency.lockutils [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:17 np0005592767 nova_compute[182623]: 2026-01-22 22:45:17.708 182627 DEBUG nova.network.neutron [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing network info cache for port 0ed65493-a756-4d1d-88c3-edf23728b4e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:18 np0005592767 podman[234617]: 2026-01-22 22:45:18.149916472 +0000 UTC m=+0.066393609 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:45:18 np0005592767 podman[234616]: 2026-01-22 22:45:18.162012654 +0000 UTC m=+0.074010545 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.190 182627 DEBUG nova.network.neutron [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.562 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Successfully updated port: ebd087e9-f858-41e2-a292-74b1525897a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.576 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.624 182627 DEBUG nova.network.neutron [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.641 182627 DEBUG oslo_concurrency.lockutils [req-ffb193e6-27d3-4d1b-b65f-30b70bf91453 req-e2f64a4a-fa03-4f10-a0da-bc93641ec4c1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.643 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.643 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:45:18 np0005592767 nova_compute[182623]: 2026-01-22 22:45:18.900 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.615 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121904.6141815, f8123605-8922-47fd-b7ac-fba5cfac36d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.616 182627 INFO nova.compute.manager [-] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.640 182627 DEBUG nova.compute.manager [None req-8288b406-cec4-4173-9e1b-74d2f7e4749c - - - - - -] [instance: f8123605-8922-47fd-b7ac-fba5cfac36d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.846 182627 DEBUG nova.compute.manager [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-changed-ebd087e9-f858-41e2-a292-74b1525897a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.846 182627 DEBUG nova.compute.manager [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing instance network info cache due to event network-changed-ebd087e9-f858-41e2-a292-74b1525897a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.847 182627 DEBUG oslo_concurrency.lockutils [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.917 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121904.9158025, 1f55de0e-e258-4f65-a0e0-f26bebf85ccb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.918 182627 INFO nova.compute.manager [-] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:45:19 np0005592767 nova_compute[182623]: 2026-01-22 22:45:19.993 182627 DEBUG nova.compute.manager [None req-e0f4eabb-1b3e-4c48-90d6-754edfe59d56 - - - - - -] [instance: 1f55de0e-e258-4f65-a0e0-f26bebf85ccb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.045 182627 DEBUG nova.network.neutron [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.062 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121906.0586364, 221039e7-b475-4211-93ed-ba13c9108ed0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.062 182627 INFO nova.compute.manager [-] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.078 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.078 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Instance network_info: |[{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.079 182627 DEBUG oslo_concurrency.lockutils [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.079 182627 DEBUG nova.network.neutron [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing network info cache for port ebd087e9-f858-41e2-a292-74b1525897a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.083 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Start _get_guest_xml network_info=[{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.090 182627 WARNING nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.098 182627 DEBUG nova.virt.libvirt.host [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.099 182627 DEBUG nova.virt.libvirt.host [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.104 182627 DEBUG nova.virt.libvirt.host [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.105 182627 DEBUG nova.virt.libvirt.host [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.107 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.107 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.108 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.108 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.109 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.109 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.110 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.111 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.111 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.112 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.112 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.113 182627 DEBUG nova.virt.hardware [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.119 182627 DEBUG nova.virt.libvirt.vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-963547908',display_name='tempest-TestGettingAddress-server-963547908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-963547908',id=152,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFA/xLR7uCniKKN3J3MvDuDgq1k2nXD6/IVnDx/Ggw7OKIgvP5wWB27A9uTX9C51/qnV7tsqKWD6+hQspX1pNm6zhYxiNvuQC78wZdTIpaMOM4mz1wMfreEOXQ6GZpFSQ==',key_name='tempest-TestGettingAddress-2019823065',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-q2zoca35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:12Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bc03363d-d5c8-41bf-b821-89de00c02b83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.120 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.121 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.123 182627 DEBUG nova.virt.libvirt.vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-963547908',display_name='tempest-TestGettingAddress-server-963547908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-963547908',id=152,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFA/xLR7uCniKKN3J3MvDuDgq1k2nXD6/IVnDx/Ggw7OKIgvP5wWB27A9uTX9C51/qnV7tsqKWD6+hQspX1pNm6zhYxiNvuQC78wZdTIpaMOM4mz1wMfreEOXQ6GZpFSQ==',key_name='tempest-TestGettingAddress-2019823065',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-q2zoca35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:12Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bc03363d-d5c8-41bf-b821-89de00c02b83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.123 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.125 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.126 182627 DEBUG nova.objects.instance [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid bc03363d-d5c8-41bf-b821-89de00c02b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.128 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.132 182627 DEBUG nova.compute.manager [None req-560b78d6-4f4d-4b83-ba0e-ea3d429ac0c9 - - - - - -] [instance: 221039e7-b475-4211-93ed-ba13c9108ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.149 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <uuid>bc03363d-d5c8-41bf-b821-89de00c02b83</uuid>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <name>instance-00000098</name>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestGettingAddress-server-963547908</nova:name>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:45:21</nova:creationTime>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:port uuid="0ed65493-a756-4d1d-88c3-edf23728b4e9">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        <nova:port uuid="ebd087e9-f858-41e2-a292-74b1525897a7">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe02:7b69" ipVersion="6"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <entry name="serial">bc03363d-d5c8-41bf-b821-89de00c02b83</entry>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <entry name="uuid">bc03363d-d5c8-41bf-b821-89de00c02b83</entry>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.config"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:42:51:c2"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <target dev="tap0ed65493-a7"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:02:7b:69"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <target dev="tapebd087e9-f8"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/console.log" append="off"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:45:21 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:45:21 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:45:21 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:45:21 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.150 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Preparing to wait for external event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.151 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.152 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.152 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.153 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Preparing to wait for external event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.153 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.154 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.154 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.155 182627 DEBUG nova.virt.libvirt.vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-963547908',display_name='tempest-TestGettingAddress-server-963547908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-963547908',id=152,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFA/xLR7uCniKKN3J3MvDuDgq1k2nXD6/IVnDx/Ggw7OKIgvP5wWB27A9uTX9C51/qnV7tsqKWD6+hQspX1pNm6zhYxiNvuQC78wZdTIpaMOM4mz1wMfreEOXQ6GZpFSQ==',key_name='tempest-TestGettingAddress-2019823065',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-q2zoca35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:12Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bc03363d-d5c8-41bf-b821-89de00c02b83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.156 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.157 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.158 182627 DEBUG os_vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.160 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.160 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.165 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ed65493-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.166 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ed65493-a7, col_values=(('external_ids', {'iface-id': '0ed65493-a756-4d1d-88c3-edf23728b4e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:51:c2', 'vm-uuid': 'bc03363d-d5c8-41bf-b821-89de00c02b83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.1690] manager: (tap0ed65493-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.171 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.175 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.177 182627 INFO os_vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7')#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.178 182627 DEBUG nova.virt.libvirt.vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-963547908',display_name='tempest-TestGettingAddress-server-963547908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-963547908',id=152,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFA/xLR7uCniKKN3J3MvDuDgq1k2nXD6/IVnDx/Ggw7OKIgvP5wWB27A9uTX9C51/qnV7tsqKWD6+hQspX1pNm6zhYxiNvuQC78wZdTIpaMOM4mz1wMfreEOXQ6GZpFSQ==',key_name='tempest-TestGettingAddress-2019823065',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-q2zoca35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:12Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bc03363d-d5c8-41bf-b821-89de00c02b83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.179 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.180 182627 DEBUG nova.network.os_vif_util [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.181 182627 DEBUG os_vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.182 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.182 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.183 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.186 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.187 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebd087e9-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.187 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebd087e9-f8, col_values=(('external_ids', {'iface-id': 'ebd087e9-f858-41e2-a292-74b1525897a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:7b:69', 'vm-uuid': 'bc03363d-d5c8-41bf-b821-89de00c02b83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.189 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.1907] manager: (tapebd087e9-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.193 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.200 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.202 182627 INFO os_vif [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8')#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.265 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.266 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.266 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:42:51:c2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.267 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:02:7b:69, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.268 182627 INFO nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Using config drive#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.640 182627 INFO nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Creating config drive at /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.config#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.647 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4f98d9ju execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.775 182627 DEBUG oslo_concurrency.processutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4f98d9ju" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:21 np0005592767 kernel: tap0ed65493-a7: entered promiscuous mode
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.8592] manager: (tap0ed65493-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Jan 22 17:45:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:21Z|00655|binding|INFO|Claiming lport 0ed65493-a756-4d1d-88c3-edf23728b4e9 for this chassis.
Jan 22 17:45:21 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:21Z|00656|binding|INFO|0ed65493-a756-4d1d-88c3-edf23728b4e9: Claiming fa:16:3e:42:51:c2 10.100.0.3
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.907 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.917 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9210] manager: (tapebd087e9-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/300)
Jan 22 17:45:21 np0005592767 kernel: tapebd087e9-f8: entered promiscuous mode
Jan 22 17:45:21 np0005592767 nova_compute[182623]: 2026-01-22 22:45:21.925 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9288] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9297] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.930 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:51:c2 10.100.0.3'], port_security=['fa:16:3e:42:51:c2 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bc03363d-d5c8-41bf-b821-89de00c02b83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dda117a3-a384-4532-9c43-17e97d829023', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bfcc72a-32e4-4429-8fda-574c467dd87c, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0ed65493-a756-4d1d-88c3-edf23728b4e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.931 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0ed65493-a756-4d1d-88c3-edf23728b4e9 in datapath 7e79faf0-0de6-49d0-a2b3-7134e2a56b40 bound to our chassis#033[00m
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.933 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e79faf0-0de6-49d0-a2b3-7134e2a56b40#033[00m
Jan 22 17:45:21 np0005592767 systemd-udevd[234686]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:21 np0005592767 systemd-udevd[234687]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.946 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a68ae25f-e1d7-4976-b506-114083b05f3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.948 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e79faf0-01 in ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.952 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e79faf0-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.952 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[efc73fa1-51d4-4c4a-a3dc-4bda885f301a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.953 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b47e3bfe-8230-47d5-8daa-2044e5178cb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9631] device (tap0ed65493-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9642] device (tapebd087e9-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9650] device (tap0ed65493-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:45:21 np0005592767 NetworkManager[54973]: <info>  [1769121921.9657] device (tapebd087e9-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.965 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c947d0e5-b4d1-424b-8259-93ccc7f6102b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:21 np0005592767 systemd-machined[153912]: New machine qemu-81-instance-00000098.
Jan 22 17:45:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:21.991 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[61b767a2-331e-4796-9c22-2f71856de048]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.024 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ee27d85b-c753-445f-93a9-eee0e170244c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 NetworkManager[54973]: <info>  [1769121922.0426] manager: (tap7e79faf0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.041 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0257dff6-6779-4a1c-9bb2-d3e550904050]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 systemd[1]: Started Virtual Machine qemu-81-instance-00000098.
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.058 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.061 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00657|binding|INFO|Claiming lport ebd087e9-f858-41e2-a292-74b1525897a7 for this chassis.
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00658|binding|INFO|ebd087e9-f858-41e2-a292-74b1525897a7: Claiming fa:16:3e:02:7b:69 2001:db8::f816:3eff:fe02:7b69
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00659|binding|INFO|Setting lport 0ed65493-a756-4d1d-88c3-edf23728b4e9 ovn-installed in OVS
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.085 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.085 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[943ec373-0706-481d-b8ee-17eb0073ebcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00660|binding|INFO|Setting lport 0ed65493-a756-4d1d-88c3-edf23728b4e9 up in Southbound
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.090 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:7b:69 2001:db8::f816:3eff:fe02:7b69'], port_security=['fa:16:3e:02:7b:69 2001:db8::f816:3eff:fe02:7b69'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe02:7b69/64', 'neutron:device_id': 'bc03363d-d5c8-41bf-b821-89de00c02b83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dda117a3-a384-4532-9c43-17e97d829023', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb971c8-a8f0-488c-9566-d6505e4ced27, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ebd087e9-f858-41e2-a292-74b1525897a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.089 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[cf936b16-9c06-4ead-85d1-ce4279158bae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00661|binding|INFO|Setting lport ebd087e9-f858-41e2-a292-74b1525897a7 ovn-installed in OVS
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00662|binding|INFO|Setting lport ebd087e9-f858-41e2-a292-74b1525897a7 up in Southbound
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.109 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 NetworkManager[54973]: <info>  [1769121922.1187] device (tap7e79faf0-00): carrier: link connected
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.123 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d8897f42-f5eb-435b-b775-d154a0c9f204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.138 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1a36e864-a019-4e09-9198-8e76883e1560]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e79faf0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:f9:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551869, 'reachable_time': 16786, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234721, 'error': None, 'target': 'ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.155 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4abdc0ee-387a-4edb-a6ce-219ed9db9211]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe39:f905'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 551869, 'tstamp': 551869}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234722, 'error': None, 'target': 'ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.175 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[fcffb90c-2a61-4136-8a88-dc9f5ba89a31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e79faf0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:f9:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551869, 'reachable_time': 16786, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234724, 'error': None, 'target': 'ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.212 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7a09245b-a7bf-4a70-b78d-2b817afaf3c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.238 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.275 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8e96d36f-d607-434a-a9c5-ac26d8ca33b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.277 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e79faf0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.277 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.277 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e79faf0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.278 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 NetworkManager[54973]: <info>  [1769121922.2800] manager: (tap7e79faf0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 22 17:45:22 np0005592767 kernel: tap7e79faf0-00: entered promiscuous mode
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.282 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.283 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e79faf0-00, col_values=(('external_ids', {'iface-id': 'b119bda8-e721-47a4-a48a-7986af8a5417'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.284 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:22Z|00663|binding|INFO|Releasing lport b119bda8-e721-47a4-a48a-7986af8a5417 from this chassis (sb_readonly=0)
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.297 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.298 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e79faf0-0de6-49d0-a2b3-7134e2a56b40.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e79faf0-0de6-49d0-a2b3-7134e2a56b40.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.299 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d6163f62-ab9d-4212-b86c-3ffe8a5e2ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.300 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-7e79faf0-0de6-49d0-a2b3-7134e2a56b40
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/7e79faf0-0de6-49d0-a2b3-7134e2a56b40.pid.haproxy
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 7e79faf0-0de6-49d0-a2b3-7134e2a56b40
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.301 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'env', 'PROCESS_TAG=haproxy-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e79faf0-0de6-49d0-a2b3-7134e2a56b40.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.394 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121922.394246, bc03363d-d5c8-41bf-b821-89de00c02b83 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.395 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] VM Started (Lifecycle Event)#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.418 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.423 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121922.3944366, bc03363d-d5c8-41bf-b821-89de00c02b83 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.424 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.448 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.452 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.478 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:45:22 np0005592767 podman[234764]: 2026-01-22 22:45:22.669640916 +0000 UTC m=+0.064021752 container create 23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:22 np0005592767 systemd[1]: Started libpod-conmon-23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152.scope.
Jan 22 17:45:22 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:45:22 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cabfda71e69303de957bd2d183661c1b2e82d28e87d73c45fdd8cb4335a26e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:45:22 np0005592767 podman[234764]: 2026-01-22 22:45:22.642677383 +0000 UTC m=+0.037058239 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.743 182627 DEBUG nova.compute.manager [req-64e100db-7dc2-454c-82a9-9802357fc03e req-fe68103d-1aca-484f-815e-44f59b280b1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.744 182627 DEBUG oslo_concurrency.lockutils [req-64e100db-7dc2-454c-82a9-9802357fc03e req-fe68103d-1aca-484f-815e-44f59b280b1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.744 182627 DEBUG oslo_concurrency.lockutils [req-64e100db-7dc2-454c-82a9-9802357fc03e req-fe68103d-1aca-484f-815e-44f59b280b1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.744 182627 DEBUG oslo_concurrency.lockutils [req-64e100db-7dc2-454c-82a9-9802357fc03e req-fe68103d-1aca-484f-815e-44f59b280b1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.745 182627 DEBUG nova.compute.manager [req-64e100db-7dc2-454c-82a9-9802357fc03e req-fe68103d-1aca-484f-815e-44f59b280b1d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Processing event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:45:22 np0005592767 podman[234764]: 2026-01-22 22:45:22.747260982 +0000 UTC m=+0.141641838 container init 23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:45:22 np0005592767 podman[234764]: 2026-01-22 22:45:22.753850998 +0000 UTC m=+0.148231834 container start 23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:45:22 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [NOTICE]   (234783) : New worker (234785) forked
Jan 22 17:45:22 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [NOTICE]   (234783) : Loading success.
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.823 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ebd087e9-f858-41e2-a292-74b1525897a7 in datapath e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 unbound from our chassis#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.826 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e40f5d78-fe4a-4e89-9d9a-b5048293f4d5#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.838 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2071fb92-f3e1-494b-9278-b6302bec6ed2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.839 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape40f5d78-f1 in ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.842 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape40f5d78-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.842 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fbc856-e899-4813-8ccd-6763bcec036b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.843 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0e77ae33-5161-490f-ae60-66b8600d3465]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.853 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ff5a59-4c81-4822-a456-671892fcac56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.866 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c072a3ea-44a7-4cca-8429-605964316bfe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.893 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b90aa8-f786-498e-a8c1-12e7e7ce233e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.894 182627 DEBUG nova.network.neutron [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updated VIF entry in instance network info cache for port ebd087e9-f858-41e2-a292-74b1525897a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.894 182627 DEBUG nova.network.neutron [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:22 np0005592767 systemd-udevd[234700]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:22 np0005592767 NetworkManager[54973]: <info>  [1769121922.9033] manager: (tape40f5d78-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/305)
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.902 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[aca4dc52-4704-408c-88c3-a41deafcee7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 nova_compute[182623]: 2026-01-22 22:45:22.916 182627 DEBUG oslo_concurrency.lockutils [req-4ec6b15b-05fb-41a1-b034-762d74213a1e req-30915eb6-eec6-47b1-8c7c-fc0519306135 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.935 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a8fa5f43-2748-4f32-8c06-5d8b13721497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.938 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4e097427-cb6b-4dcd-950c-06cdcd93e848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 NetworkManager[54973]: <info>  [1769121922.9615] device (tape40f5d78-f0): carrier: link connected
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.965 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc893e9-cbb6-4578-a691-384d66f96471]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.981 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9c317b70-44ea-4cc0-8020-bb98a75de0fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape40f5d78-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:ba:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551953, 'reachable_time': 16232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234804, 'error': None, 'target': 'ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:22.996 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c88055c5-0b74-4ba6-ae39-f7b5543b0d67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9d:ba8a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 551953, 'tstamp': 551953}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234805, 'error': None, 'target': 'ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.012 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[07ad8e80-ef6c-4a4f-883c-302378ca60ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape40f5d78-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:ba:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551953, 'reachable_time': 16232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234806, 'error': None, 'target': 'ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.046 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e082a072-d53d-46fb-8485-1b809ec29190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.076 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[70739446-61a3-43da-a1c3-74d6512f8825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.077 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape40f5d78-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.078 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.078 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape40f5d78-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:23 np0005592767 NetworkManager[54973]: <info>  [1769121923.1238] manager: (tape40f5d78-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Jan 22 17:45:23 np0005592767 nova_compute[182623]: 2026-01-22 22:45:23.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:23 np0005592767 kernel: tape40f5d78-f0: entered promiscuous mode
Jan 22 17:45:23 np0005592767 nova_compute[182623]: 2026-01-22 22:45:23.127 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.127 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape40f5d78-f0, col_values=(('external_ids', {'iface-id': '3cbd6e94-c5df-449c-8c4c-dc3faebc9803'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:23 np0005592767 nova_compute[182623]: 2026-01-22 22:45:23.128 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:23 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:23Z|00664|binding|INFO|Releasing lport 3cbd6e94-c5df-449c-8c4c-dc3faebc9803 from this chassis (sb_readonly=0)
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.153 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e40f5d78-fe4a-4e89-9d9a-b5048293f4d5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e40f5d78-fe4a-4e89-9d9a-b5048293f4d5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:45:23 np0005592767 nova_compute[182623]: 2026-01-22 22:45:23.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.154 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3f989545-ffe5-4a2d-aec4-a28e47066d05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.154 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/e40f5d78-fe4a-4e89-9d9a-b5048293f4d5.pid.haproxy
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID e40f5d78-fe4a-4e89-9d9a-b5048293f4d5
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:45:23 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:23.155 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'env', 'PROCESS_TAG=haproxy-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e40f5d78-fe4a-4e89-9d9a-b5048293f4d5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:45:23 np0005592767 podman[234836]: 2026-01-22 22:45:23.494738745 +0000 UTC m=+0.050892051 container create 1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:45:23 np0005592767 systemd[1]: Started libpod-conmon-1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb.scope.
Jan 22 17:45:23 np0005592767 podman[234836]: 2026-01-22 22:45:23.466860396 +0000 UTC m=+0.023013712 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:45:23 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:45:23 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/179af8cd5090e623f55ed05099b26d0f899312ad2e8531eb4ffbe84f5589d03c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:45:23 np0005592767 podman[234836]: 2026-01-22 22:45:23.60415997 +0000 UTC m=+0.160313356 container init 1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 17:45:23 np0005592767 podman[234836]: 2026-01-22 22:45:23.610767967 +0000 UTC m=+0.166921303 container start 1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:45:23 np0005592767 podman[234854]: 2026-01-22 22:45:23.623798366 +0000 UTC m=+0.061920993 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:45:23 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [NOTICE]   (234872) : New worker (234880) forked
Jan 22 17:45:23 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [NOTICE]   (234872) : Loading success.
Jan 22 17:45:24 np0005592767 nova_compute[182623]: 2026-01-22 22:45:24.860 182627 DEBUG nova.compute.manager [req-008df5f9-e81c-4153-ba3d-c3024bcd0209 req-252b42c0-85d9-401d-a7d4-1048b774c8cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:24 np0005592767 nova_compute[182623]: 2026-01-22 22:45:24.861 182627 DEBUG oslo_concurrency.lockutils [req-008df5f9-e81c-4153-ba3d-c3024bcd0209 req-252b42c0-85d9-401d-a7d4-1048b774c8cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:24 np0005592767 nova_compute[182623]: 2026-01-22 22:45:24.861 182627 DEBUG oslo_concurrency.lockutils [req-008df5f9-e81c-4153-ba3d-c3024bcd0209 req-252b42c0-85d9-401d-a7d4-1048b774c8cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:24 np0005592767 nova_compute[182623]: 2026-01-22 22:45:24.861 182627 DEBUG oslo_concurrency.lockutils [req-008df5f9-e81c-4153-ba3d-c3024bcd0209 req-252b42c0-85d9-401d-a7d4-1048b774c8cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:24 np0005592767 nova_compute[182623]: 2026-01-22 22:45:24.861 182627 DEBUG nova.compute.manager [req-008df5f9-e81c-4153-ba3d-c3024bcd0209 req-252b42c0-85d9-401d-a7d4-1048b774c8cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] No event matching network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 in dict_keys([('network-vif-plugged', '0ed65493-a756-4d1d-88c3-edf23728b4e9')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 22 17:45:24 np0005592767 nova_compute[182623]: 2026-01-22 22:45:24.862 182627 WARNING nova.compute.manager [req-008df5f9-e81c-4153-ba3d-c3024bcd0209 req-252b42c0-85d9-401d-a7d4-1048b774c8cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received unexpected event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:45:25 np0005592767 nova_compute[182623]: 2026-01-22 22:45:25.215 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.005 182627 DEBUG nova.compute.manager [req-d6a48e64-7e1f-4b55-810d-393d33a94cdc req-6259d935-34e8-48f0-acda-3dec5210f01d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.006 182627 DEBUG oslo_concurrency.lockutils [req-d6a48e64-7e1f-4b55-810d-393d33a94cdc req-6259d935-34e8-48f0-acda-3dec5210f01d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.006 182627 DEBUG oslo_concurrency.lockutils [req-d6a48e64-7e1f-4b55-810d-393d33a94cdc req-6259d935-34e8-48f0-acda-3dec5210f01d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.007 182627 DEBUG oslo_concurrency.lockutils [req-d6a48e64-7e1f-4b55-810d-393d33a94cdc req-6259d935-34e8-48f0-acda-3dec5210f01d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.007 182627 DEBUG nova.compute.manager [req-d6a48e64-7e1f-4b55-810d-393d33a94cdc req-6259d935-34e8-48f0-acda-3dec5210f01d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Processing event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.009 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Instance event wait completed in 3 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.014 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121926.0145264, bc03363d-d5c8-41bf-b821-89de00c02b83 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.015 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.017 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.022 182627 INFO nova.virt.libvirt.driver [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Instance spawned successfully.#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.022 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.044 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.048 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.058 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.059 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.059 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.060 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.060 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.060 182627 DEBUG nova.virt.libvirt.driver [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.066 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.149 182627 INFO nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Took 13.55 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.149 182627 DEBUG nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.250 182627 INFO nova.compute.manager [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Took 14.16 seconds to build instance.#033[00m
Jan 22 17:45:26 np0005592767 nova_compute[182623]: 2026-01-22 22:45:26.268 182627 DEBUG oslo_concurrency.lockutils [None req-9bec20c3-5d66-4c92-a84f-037ed08e3537 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:27 np0005592767 nova_compute[182623]: 2026-01-22 22:45:27.244 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.106 182627 DEBUG nova.compute.manager [req-51d42057-062a-44c4-94e9-c0af1f1e4c24 req-7502a583-f878-4ca0-8fbf-3b7650e3e597 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.107 182627 DEBUG oslo_concurrency.lockutils [req-51d42057-062a-44c4-94e9-c0af1f1e4c24 req-7502a583-f878-4ca0-8fbf-3b7650e3e597 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.107 182627 DEBUG oslo_concurrency.lockutils [req-51d42057-062a-44c4-94e9-c0af1f1e4c24 req-7502a583-f878-4ca0-8fbf-3b7650e3e597 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.107 182627 DEBUG oslo_concurrency.lockutils [req-51d42057-062a-44c4-94e9-c0af1f1e4c24 req-7502a583-f878-4ca0-8fbf-3b7650e3e597 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.107 182627 DEBUG nova.compute.manager [req-51d42057-062a-44c4-94e9-c0af1f1e4c24 req-7502a583-f878-4ca0-8fbf-3b7650e3e597 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] No waiting events found dispatching network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.108 182627 WARNING nova.compute.manager [req-51d42057-062a-44c4-94e9-c0af1f1e4c24 req-7502a583-f878-4ca0-8fbf-3b7650e3e597 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received unexpected event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.299 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:45:28 np0005592767 nova_compute[182623]: 2026-01-22 22:45:28.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:45:29 np0005592767 nova_compute[182623]: 2026-01-22 22:45:29.210 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:29 np0005592767 nova_compute[182623]: 2026-01-22 22:45:29.210 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:29 np0005592767 nova_compute[182623]: 2026-01-22 22:45:29.211 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:45:29 np0005592767 nova_compute[182623]: 2026-01-22 22:45:29.211 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bc03363d-d5c8-41bf-b821-89de00c02b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.064 182627 DEBUG nova.compute.manager [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-changed-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.065 182627 DEBUG nova.compute.manager [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing instance network info cache due to event network-changed-0ed65493-a756-4d1d-88c3-edf23728b4e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.065 182627 DEBUG oslo_concurrency.lockutils [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.194 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.702 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.751 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.751 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.752 182627 DEBUG oslo_concurrency.lockutils [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.753 182627 DEBUG nova.network.neutron [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing network info cache for port 0ed65493-a756-4d1d-88c3-edf23728b4e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.755 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.756 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.757 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.787 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.788 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.789 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.789 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.880 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.978 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:31 np0005592767 nova_compute[182623]: 2026-01-22 22:45:31.979 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.066 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.244 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.311 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.313 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5486MB free_disk=73.12001419067383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.313 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.313 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.412 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance bc03363d-d5c8-41bf-b821-89de00c02b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.413 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.414 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.472 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.498 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.528 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.529 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.670 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:32 np0005592767 nova_compute[182623]: 2026-01-22 22:45:32.927 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:33 np0005592767 nova_compute[182623]: 2026-01-22 22:45:33.025 182627 DEBUG nova.network.neutron [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updated VIF entry in instance network info cache for port 0ed65493-a756-4d1d-88c3-edf23728b4e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:45:33 np0005592767 nova_compute[182623]: 2026-01-22 22:45:33.026 182627 DEBUG nova.network.neutron [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:33 np0005592767 nova_compute[182623]: 2026-01-22 22:45:33.043 182627 DEBUG oslo_concurrency.lockutils [req-da58aff1-1d53-4ad5-b3bf-c8b600814edf req-05bdee37-cd68-4671-bc42-3eeba3ebcec2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:34 np0005592767 nova_compute[182623]: 2026-01-22 22:45:34.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:34 np0005592767 nova_compute[182623]: 2026-01-22 22:45:34.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:45:35 np0005592767 podman[234896]: 2026-01-22 22:45:35.192851157 +0000 UTC m=+0.104368553 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:45:35 np0005592767 nova_compute[182623]: 2026-01-22 22:45:35.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:36 np0005592767 nova_compute[182623]: 2026-01-22 22:45:36.199 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:36 np0005592767 nova_compute[182623]: 2026-01-22 22:45:36.492 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:37 np0005592767 nova_compute[182623]: 2026-01-22 22:45:37.247 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:38 np0005592767 nova_compute[182623]: 2026-01-22 22:45:38.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:39Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:51:c2 10.100.0.3
Jan 22 17:45:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:39Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:51:c2 10.100.0.3
Jan 22 17:45:40 np0005592767 podman[234932]: 2026-01-22 22:45:40.259879944 +0000 UTC m=+0.164322619 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:45:40 np0005592767 podman[234933]: 2026-01-22 22:45:40.266782169 +0000 UTC m=+0.141227186 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Jan 22 17:45:41 np0005592767 nova_compute[182623]: 2026-01-22 22:45:41.204 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:41 np0005592767 nova_compute[182623]: 2026-01-22 22:45:41.894 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.086 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.087 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.105 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.246 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.248 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.250 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.263 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.263 182627 INFO nova.compute.claims [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.436 182627 DEBUG nova.compute.provider_tree [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.449 182627 DEBUG nova.scheduler.client.report [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.469 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.470 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.532 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.532 182627 DEBUG nova.network.neutron [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.547 182627 INFO nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.562 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.665 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.667 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.668 182627 INFO nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Creating image(s)#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.669 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "/var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.670 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "/var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.672 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "/var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.698 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.774 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.776 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.777 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.808 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.883 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.885 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.923 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.926 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.926 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.991 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.992 182627 DEBUG nova.virt.disk.api [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Checking if we can resize image /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:45:42 np0005592767 nova_compute[182623]: 2026-01-22 22:45:42.993 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.011 182627 DEBUG nova.policy [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aeafaa260c0241a2889fa68557afaac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '938a70d81adf4493b2089cf126338d06', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.048 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.049 182627 DEBUG nova.virt.disk.api [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Cannot resize image /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.049 182627 DEBUG nova.objects.instance [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lazy-loading 'migration_context' on Instance uuid e00f1455-458c-45c4-a64c-1b232e5e7f98 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.064 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.065 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Ensure instance console log exists: /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.065 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.066 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.066 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:43 np0005592767 nova_compute[182623]: 2026-01-22 22:45:43.666 182627 DEBUG nova.network.neutron [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Successfully created port: 916f60a4-b752-4bdf-be95-24d8336ff086 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:45:44 np0005592767 nova_compute[182623]: 2026-01-22 22:45:44.545 182627 DEBUG nova.network.neutron [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Successfully updated port: 916f60a4-b752-4bdf-be95-24d8336ff086 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:45:44 np0005592767 nova_compute[182623]: 2026-01-22 22:45:44.570 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "refresh_cache-e00f1455-458c-45c4-a64c-1b232e5e7f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:44 np0005592767 nova_compute[182623]: 2026-01-22 22:45:44.570 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquired lock "refresh_cache-e00f1455-458c-45c4-a64c-1b232e5e7f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:44 np0005592767 nova_compute[182623]: 2026-01-22 22:45:44.570 182627 DEBUG nova.network.neutron [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:45:44 np0005592767 nova_compute[182623]: 2026-01-22 22:45:44.750 182627 DEBUG nova.network.neutron [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.525 182627 DEBUG nova.compute.manager [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-changed-916f60a4-b752-4bdf-be95-24d8336ff086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.525 182627 DEBUG nova.compute.manager [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Refreshing instance network info cache due to event network-changed-916f60a4-b752-4bdf-be95-24d8336ff086. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.526 182627 DEBUG oslo_concurrency.lockutils [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-e00f1455-458c-45c4-a64c-1b232e5e7f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.961 182627 DEBUG nova.network.neutron [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Updating instance_info_cache with network_info: [{"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.979 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Releasing lock "refresh_cache-e00f1455-458c-45c4-a64c-1b232e5e7f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.980 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Instance network_info: |[{"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.980 182627 DEBUG oslo_concurrency.lockutils [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-e00f1455-458c-45c4-a64c-1b232e5e7f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.981 182627 DEBUG nova.network.neutron [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Refreshing network info cache for port 916f60a4-b752-4bdf-be95-24d8336ff086 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.984 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Start _get_guest_xml network_info=[{"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.990 182627 WARNING nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.997 182627 DEBUG nova.virt.libvirt.host [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:45:45 np0005592767 nova_compute[182623]: 2026-01-22 22:45:45.998 182627 DEBUG nova.virt.libvirt.host [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.003 182627 DEBUG nova.virt.libvirt.host [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.004 182627 DEBUG nova.virt.libvirt.host [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.005 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.005 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.006 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.006 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.007 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.007 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.007 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.008 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.008 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.009 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.009 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.009 182627 DEBUG nova.virt.hardware [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.014 182627 DEBUG nova.virt.libvirt.vif [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1282826420',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1282826420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1282826420',id=154,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='938a70d81adf4493b2089cf126338d06',ramdisk_id='',reservation_id='r-28tqtwpe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-150
2681609',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1502681609-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:42Z,user_data=None,user_id='aeafaa260c0241a2889fa68557afaac4',uuid=e00f1455-458c-45c4-a64c-1b232e5e7f98,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.015 182627 DEBUG nova.network.os_vif_util [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Converting VIF {"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.016 182627 DEBUG nova.network.os_vif_util [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.017 182627 DEBUG nova.objects.instance [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lazy-loading 'pci_devices' on Instance uuid e00f1455-458c-45c4-a64c-1b232e5e7f98 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.034 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <uuid>e00f1455-458c-45c4-a64c-1b232e5e7f98</uuid>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <name>instance-0000009a</name>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-1282826420</nova:name>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:45:45</nova:creationTime>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:user uuid="aeafaa260c0241a2889fa68557afaac4">tempest-ServersNegativeTestMultiTenantJSON-1502681609-project-member</nova:user>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:project uuid="938a70d81adf4493b2089cf126338d06">tempest-ServersNegativeTestMultiTenantJSON-1502681609</nova:project>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        <nova:port uuid="916f60a4-b752-4bdf-be95-24d8336ff086">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <entry name="serial">e00f1455-458c-45c4-a64c-1b232e5e7f98</entry>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <entry name="uuid">e00f1455-458c-45c4-a64c-1b232e5e7f98</entry>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.config"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:43:33:50"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <target dev="tap916f60a4-b7"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/console.log" append="off"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:45:46 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:45:46 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:45:46 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:45:46 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.036 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Preparing to wait for external event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.037 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.038 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.038 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.039 182627 DEBUG nova.virt.libvirt.vif [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1282826420',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1282826420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1282826420',id=154,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='938a70d81adf4493b2089cf126338d06',ramdisk_id='',reservation_id='r-28tqtwpe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTena
ntJSON-1502681609',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1502681609-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:45:42Z,user_data=None,user_id='aeafaa260c0241a2889fa68557afaac4',uuid=e00f1455-458c-45c4-a64c-1b232e5e7f98,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.040 182627 DEBUG nova.network.os_vif_util [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Converting VIF {"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.041 182627 DEBUG nova.network.os_vif_util [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.041 182627 DEBUG os_vif [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.042 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.043 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.043 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.049 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.050 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap916f60a4-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.051 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap916f60a4-b7, col_values=(('external_ids', {'iface-id': '916f60a4-b752-4bdf-be95-24d8336ff086', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:33:50', 'vm-uuid': 'e00f1455-458c-45c4-a64c-1b232e5e7f98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.053 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:46 np0005592767 NetworkManager[54973]: <info>  [1769121946.0542] manager: (tap916f60a4-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.056 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.061 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.064 182627 INFO os_vif [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7')#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.118 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.119 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.120 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] No VIF found with MAC fa:16:3e:43:33:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.121 182627 INFO nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Using config drive#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.899 182627 INFO nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Creating config drive at /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.config#033[00m
Jan 22 17:45:46 np0005592767 nova_compute[182623]: 2026-01-22 22:45:46.910 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5hi_3_4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.055 182627 DEBUG oslo_concurrency.processutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5hi_3_4" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:45:47 np0005592767 kernel: tap916f60a4-b7: entered promiscuous mode
Jan 22 17:45:47 np0005592767 NetworkManager[54973]: <info>  [1769121947.1504] manager: (tap916f60a4-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/308)
Jan 22 17:45:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:47Z|00665|binding|INFO|Claiming lport 916f60a4-b752-4bdf-be95-24d8336ff086 for this chassis.
Jan 22 17:45:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:47Z|00666|binding|INFO|916f60a4-b752-4bdf-be95-24d8336ff086: Claiming fa:16:3e:43:33:50 10.100.0.13
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.249 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.262 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:33:50 10.100.0.13'], port_security=['fa:16:3e:43:33:50 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e00f1455-458c-45c4-a64c-1b232e5e7f98', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b45fb009-f5f0-48ec-a656-1f230f81978d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '938a70d81adf4493b2089cf126338d06', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd7173193-8bce-4dc9-9c9c-a90d5d8c43a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=231f3f8a-39d6-4ac5-9b7e-cb21bf3112ba, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=916f60a4-b752-4bdf-be95-24d8336ff086) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.263 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 916f60a4-b752-4bdf-be95-24d8336ff086 in datapath b45fb009-f5f0-48ec-a656-1f230f81978d bound to our chassis#033[00m
Jan 22 17:45:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:47Z|00667|binding|INFO|Setting lport 916f60a4-b752-4bdf-be95-24d8336ff086 ovn-installed in OVS
Jan 22 17:45:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:47Z|00668|binding|INFO|Setting lport 916f60a4-b752-4bdf-be95-24d8336ff086 up in Southbound
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.265 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b45fb009-f5f0-48ec-a656-1f230f81978d#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.266 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.281 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[be4356de-e169-4a0d-9587-63b7825aab10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.282 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb45fb009-f1 in ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:45:47 np0005592767 systemd-udevd[235011]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.285 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb45fb009-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.286 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c35cfaba-73ae-4827-a8b1-c8124a4e87cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.286 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[13b85da9-61d1-4242-bcbb-1a465c092f76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 systemd-machined[153912]: New machine qemu-82-instance-0000009a.
Jan 22 17:45:47 np0005592767 NetworkManager[54973]: <info>  [1769121947.3078] device (tap916f60a4-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:45:47 np0005592767 NetworkManager[54973]: <info>  [1769121947.3103] device (tap916f60a4-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.308 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[12ba8ba9-66b3-4553-9bbe-afde4d57b815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 systemd[1]: Started Virtual Machine qemu-82-instance-0000009a.
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.334 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c933c823-90d2-4e43-9833-006bb1b4eeb4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.386 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[39d4b638-379c-44f3-add7-425ae5b37cac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.393 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e173675a-6894-418c-a56a-8483c19ef10b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 systemd-udevd[235015]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:45:47 np0005592767 NetworkManager[54973]: <info>  [1769121947.3958] manager: (tapb45fb009-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/309)
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.436 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[28a33459-9d43-407e-90eb-902ffd28e0a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.441 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0ccb62-07d4-4eb1-83ee-d4d10043cc4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 NetworkManager[54973]: <info>  [1769121947.4747] device (tapb45fb009-f0): carrier: link connected
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.484 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5af5c034-2275-4008-956d-4d88c257872c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.510 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8e794f3a-f952-4329-8f33-d579cb1968ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb45fb009-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:fb:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554405, 'reachable_time': 32880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235045, 'error': None, 'target': 'ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.536 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7c86a3-e36e-47ed-86fe-2ab9309ac21a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe39:fb58'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554405, 'tstamp': 554405}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235050, 'error': None, 'target': 'ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.555 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6500825f-bc58-48b7-961b-47f6545bc88b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb45fb009-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:fb:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554405, 'reachable_time': 32880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235052, 'error': None, 'target': 'ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.600 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121947.5997634, e00f1455-458c-45c4-a64c-1b232e5e7f98 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.601 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] VM Started (Lifecycle Event)#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.602 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[91dba396-73ef-47ad-9ee2-b7fb9b0e3e50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.628 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.633 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121947.6018558, e00f1455-458c-45c4-a64c-1b232e5e7f98 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.634 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.653 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.658 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.680 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.688 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e42a65-27b9-4221-aba7-dddb15851497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.690 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb45fb009-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.690 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.691 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb45fb009-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.693 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:47 np0005592767 kernel: tapb45fb009-f0: entered promiscuous mode
Jan 22 17:45:47 np0005592767 NetworkManager[54973]: <info>  [1769121947.6941] manager: (tapb45fb009-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.700 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb45fb009-f0, col_values=(('external_ids', {'iface-id': '3a489083-3465-4159-b866-3da7224a4533'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.702 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:47Z|00669|binding|INFO|Releasing lport 3a489083-3465-4159-b866-3da7224a4533 from this chassis (sb_readonly=0)
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.705 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b45fb009-f5f0-48ec-a656-1f230f81978d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b45fb009-f5f0-48ec-a656-1f230f81978d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.706 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4d15f1ac-cfff-4ebe-86a6-ed783af74103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.708 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-b45fb009-f5f0-48ec-a656-1f230f81978d
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/b45fb009-f5f0-48ec-a656-1f230f81978d.pid.haproxy
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID b45fb009-f5f0-48ec-a656-1f230f81978d
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:45:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:47.709 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d', 'env', 'PROCESS_TAG=haproxy-b45fb009-f5f0-48ec-a656-1f230f81978d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b45fb009-f5f0-48ec-a656-1f230f81978d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.715 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.983 182627 DEBUG nova.compute.manager [req-79584df5-30a8-46a8-bc6e-927d09e74e32 req-210a8aa7-d600-432e-be08-7d847052d58f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.983 182627 DEBUG oslo_concurrency.lockutils [req-79584df5-30a8-46a8-bc6e-927d09e74e32 req-210a8aa7-d600-432e-be08-7d847052d58f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.984 182627 DEBUG oslo_concurrency.lockutils [req-79584df5-30a8-46a8-bc6e-927d09e74e32 req-210a8aa7-d600-432e-be08-7d847052d58f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.984 182627 DEBUG oslo_concurrency.lockutils [req-79584df5-30a8-46a8-bc6e-927d09e74e32 req-210a8aa7-d600-432e-be08-7d847052d58f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.984 182627 DEBUG nova.compute.manager [req-79584df5-30a8-46a8-bc6e-927d09e74e32 req-210a8aa7-d600-432e-be08-7d847052d58f 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Processing event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.985 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.988 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769121947.9880512, e00f1455-458c-45c4-a64c-1b232e5e7f98 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.988 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.990 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.993 182627 INFO nova.virt.libvirt.driver [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Instance spawned successfully.#033[00m
Jan 22 17:45:47 np0005592767 nova_compute[182623]: 2026-01-22 22:45:47.993 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.011 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:48.015 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.017 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.020 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.020 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.020 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.021 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.021 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.022 182627 DEBUG nova.virt.libvirt.driver [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.025 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.054 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.110 182627 INFO nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Took 5.44 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.111 182627 DEBUG nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.130 182627 DEBUG nova.network.neutron [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Updated VIF entry in instance network info cache for port 916f60a4-b752-4bdf-be95-24d8336ff086. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.130 182627 DEBUG nova.network.neutron [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Updating instance_info_cache with network_info: [{"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.148 182627 DEBUG oslo_concurrency.lockutils [req-271fde3b-866d-4f32-86f4-490e1fb5c301 req-3e14f173-b372-4cec-b4dd-6fbf69f980c7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-e00f1455-458c-45c4-a64c-1b232e5e7f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.199 182627 INFO nova.compute.manager [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Took 6.01 seconds to build instance.#033[00m
Jan 22 17:45:48 np0005592767 nova_compute[182623]: 2026-01-22 22:45:48.213 182627 DEBUG oslo_concurrency.lockutils [None req-a80571ca-5702-42b7-b301-0340d9ce1d9e aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:48 np0005592767 podman[235085]: 2026-01-22 22:45:48.220654323 +0000 UTC m=+0.070296499 container create 1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:45:48 np0005592767 systemd[1]: Started libpod-conmon-1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6.scope.
Jan 22 17:45:48 np0005592767 podman[235085]: 2026-01-22 22:45:48.19613668 +0000 UTC m=+0.045778876 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:45:48 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:45:48 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e201861ebb506df93593080a9b6909860258e18b3933d710643af108d25c3f39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:45:48 np0005592767 podman[235098]: 2026-01-22 22:45:48.323886614 +0000 UTC m=+0.065431133 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:45:48 np0005592767 podman[235085]: 2026-01-22 22:45:48.324760098 +0000 UTC m=+0.174402264 container init 1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:48 np0005592767 podman[235085]: 2026-01-22 22:45:48.334435832 +0000 UTC m=+0.184078008 container start 1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 17:45:48 np0005592767 podman[235097]: 2026-01-22 22:45:48.346628017 +0000 UTC m=+0.091535871 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:48 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [NOTICE]   (235141) : New worker (235143) forked
Jan 22 17:45:48 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [NOTICE]   (235141) : Loading success.
Jan 22 17:45:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:48.394 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.863 182627 DEBUG nova.compute.manager [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-changed-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.863 182627 DEBUG nova.compute.manager [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing instance network info cache due to event network-changed-0ed65493-a756-4d1d-88c3-edf23728b4e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.863 182627 DEBUG oslo_concurrency.lockutils [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.863 182627 DEBUG oslo_concurrency.lockutils [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.864 182627 DEBUG nova.network.neutron [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Refreshing network info cache for port 0ed65493-a756-4d1d-88c3-edf23728b4e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.939 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.939 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.940 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.940 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.940 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.952 182627 INFO nova.compute.manager [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Terminating instance#033[00m
Jan 22 17:45:49 np0005592767 nova_compute[182623]: 2026-01-22 22:45:49.963 182627 DEBUG nova.compute.manager [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:45:49 np0005592767 kernel: tap0ed65493-a7 (unregistering): left promiscuous mode
Jan 22 17:45:49 np0005592767 NetworkManager[54973]: <info>  [1769121949.9907] device (tap0ed65493-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:50Z|00670|binding|INFO|Releasing lport 0ed65493-a756-4d1d-88c3-edf23728b4e9 from this chassis (sb_readonly=0)
Jan 22 17:45:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:50Z|00671|binding|INFO|Setting lport 0ed65493-a756-4d1d-88c3-edf23728b4e9 down in Southbound
Jan 22 17:45:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:50Z|00672|binding|INFO|Removing iface tap0ed65493-a7 ovn-installed in OVS
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.002 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.007 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:51:c2 10.100.0.3'], port_security=['fa:16:3e:42:51:c2 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bc03363d-d5c8-41bf-b821-89de00c02b83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dda117a3-a384-4532-9c43-17e97d829023', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bfcc72a-32e4-4429-8fda-574c467dd87c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0ed65493-a756-4d1d-88c3-edf23728b4e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.008 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0ed65493-a756-4d1d-88c3-edf23728b4e9 in datapath 7e79faf0-0de6-49d0-a2b3-7134e2a56b40 unbound from our chassis#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.010 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e79faf0-0de6-49d0-a2b3-7134e2a56b40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.011 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7126c633-d70f-4ad4-8178-fef1c8987279]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.011 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40 namespace which is not needed anymore#033[00m
Jan 22 17:45:50 np0005592767 kernel: tapebd087e9-f8 (unregistering): left promiscuous mode
Jan 22 17:45:50 np0005592767 NetworkManager[54973]: <info>  [1769121950.0186] device (tapebd087e9-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:50Z|00673|binding|INFO|Releasing lport ebd087e9-f858-41e2-a292-74b1525897a7 from this chassis (sb_readonly=0)
Jan 22 17:45:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:50Z|00674|binding|INFO|Setting lport ebd087e9-f858-41e2-a292-74b1525897a7 down in Southbound
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:50Z|00675|binding|INFO|Removing iface tapebd087e9-f8 ovn-installed in OVS
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.033 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:7b:69 2001:db8::f816:3eff:fe02:7b69'], port_security=['fa:16:3e:02:7b:69 2001:db8::f816:3eff:fe02:7b69'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe02:7b69/64', 'neutron:device_id': 'bc03363d-d5c8-41bf-b821-89de00c02b83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dda117a3-a384-4532-9c43-17e97d829023', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb971c8-a8f0-488c-9566-d6505e4ced27, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ebd087e9-f858-41e2-a292-74b1525897a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000098.scope: Deactivated successfully.
Jan 22 17:45:50 np0005592767 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000098.scope: Consumed 13.935s CPU time.
Jan 22 17:45:50 np0005592767 systemd-machined[153912]: Machine qemu-81-instance-00000098 terminated.
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.093 182627 DEBUG nova.compute.manager [req-25059e72-d90b-437c-be0a-c865678a2591 req-49f563cf-b652-4712-8f8e-390d9cd6a44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.093 182627 DEBUG oslo_concurrency.lockutils [req-25059e72-d90b-437c-be0a-c865678a2591 req-49f563cf-b652-4712-8f8e-390d9cd6a44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.093 182627 DEBUG oslo_concurrency.lockutils [req-25059e72-d90b-437c-be0a-c865678a2591 req-49f563cf-b652-4712-8f8e-390d9cd6a44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.094 182627 DEBUG oslo_concurrency.lockutils [req-25059e72-d90b-437c-be0a-c865678a2591 req-49f563cf-b652-4712-8f8e-390d9cd6a44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.094 182627 DEBUG nova.compute.manager [req-25059e72-d90b-437c-be0a-c865678a2591 req-49f563cf-b652-4712-8f8e-390d9cd6a44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] No waiting events found dispatching network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.094 182627 WARNING nova.compute.manager [req-25059e72-d90b-437c-be0a-c865678a2591 req-49f563cf-b652-4712-8f8e-390d9cd6a44b 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received unexpected event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [NOTICE]   (234783) : haproxy version is 2.8.14-c23fe91
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [NOTICE]   (234783) : path to executable is /usr/sbin/haproxy
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [WARNING]  (234783) : Exiting Master process...
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [ALERT]    (234783) : Current worker (234785) exited with code 143 (Terminated)
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40[234779]: [WARNING]  (234783) : All workers exited. Exiting... (0)
Jan 22 17:45:50 np0005592767 systemd[1]: libpod-23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152.scope: Deactivated successfully.
Jan 22 17:45:50 np0005592767 podman[235179]: 2026-01-22 22:45:50.150843282 +0000 UTC m=+0.047690440 container died 23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152-userdata-shm.mount: Deactivated successfully.
Jan 22 17:45:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay-5cabfda71e69303de957bd2d183661c1b2e82d28e87d73c45fdd8cb4335a26e4-merged.mount: Deactivated successfully.
Jan 22 17:45:50 np0005592767 podman[235179]: 2026-01-22 22:45:50.188237629 +0000 UTC m=+0.085084757 container cleanup 23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:45:50 np0005592767 systemd[1]: libpod-conmon-23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152.scope: Deactivated successfully.
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.245 182627 INFO nova.virt.libvirt.driver [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Instance destroyed successfully.#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.246 182627 DEBUG nova.objects.instance [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid bc03363d-d5c8-41bf-b821-89de00c02b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.249 182627 DEBUG nova.compute.manager [req-60e0b6d8-cbbe-4902-a84c-9594176270b7 req-97e7201e-4ada-4546-8bb7-f160969745b1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-unplugged-ebd087e9-f858-41e2-a292-74b1525897a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.250 182627 DEBUG oslo_concurrency.lockutils [req-60e0b6d8-cbbe-4902-a84c-9594176270b7 req-97e7201e-4ada-4546-8bb7-f160969745b1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.250 182627 DEBUG oslo_concurrency.lockutils [req-60e0b6d8-cbbe-4902-a84c-9594176270b7 req-97e7201e-4ada-4546-8bb7-f160969745b1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.250 182627 DEBUG oslo_concurrency.lockutils [req-60e0b6d8-cbbe-4902-a84c-9594176270b7 req-97e7201e-4ada-4546-8bb7-f160969745b1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.250 182627 DEBUG nova.compute.manager [req-60e0b6d8-cbbe-4902-a84c-9594176270b7 req-97e7201e-4ada-4546-8bb7-f160969745b1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] No waiting events found dispatching network-vif-unplugged-ebd087e9-f858-41e2-a292-74b1525897a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.250 182627 DEBUG nova.compute.manager [req-60e0b6d8-cbbe-4902-a84c-9594176270b7 req-97e7201e-4ada-4546-8bb7-f160969745b1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-unplugged-ebd087e9-f858-41e2-a292-74b1525897a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.259 182627 DEBUG nova.virt.libvirt.vif [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-963547908',display_name='tempest-TestGettingAddress-server-963547908',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-963547908',id=152,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFA/xLR7uCniKKN3J3MvDuDgq1k2nXD6/IVnDx/Ggw7OKIgvP5wWB27A9uTX9C51/qnV7tsqKWD6+hQspX1pNm6zhYxiNvuQC78wZdTIpaMOM4mz1wMfreEOXQ6GZpFSQ==',key_name='tempest-TestGettingAddress-2019823065',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:45:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-q2zoca35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:26Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bc03363d-d5c8-41bf-b821-89de00c02b83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.259 182627 DEBUG nova.network.os_vif_util [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.260 182627 DEBUG nova.network.os_vif_util [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.260 182627 DEBUG os_vif [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.262 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.262 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ed65493-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:50 np0005592767 podman[235214]: 2026-01-22 22:45:50.263816637 +0000 UTC m=+0.048214944 container remove 23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.266 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.269 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.271 182627 INFO os_vif [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:51:c2,bridge_name='br-int',has_traffic_filtering=True,id=0ed65493-a756-4d1d-88c3-edf23728b4e9,network=Network(7e79faf0-0de6-49d0-a2b3-7134e2a56b40),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed65493-a7')#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.272 182627 DEBUG nova.virt.libvirt.vif [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-963547908',display_name='tempest-TestGettingAddress-server-963547908',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-963547908',id=152,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFA/xLR7uCniKKN3J3MvDuDgq1k2nXD6/IVnDx/Ggw7OKIgvP5wWB27A9uTX9C51/qnV7tsqKWD6+hQspX1pNm6zhYxiNvuQC78wZdTIpaMOM4mz1wMfreEOXQ6GZpFSQ==',key_name='tempest-TestGettingAddress-2019823065',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:45:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-q2zoca35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:26Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bc03363d-d5c8-41bf-b821-89de00c02b83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.271 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[40fcf1b7-a34e-4ed7-9dbc-8a09ba39b407]: (4, ('Thu Jan 22 10:45:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40 (23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152)\n23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152\nThu Jan 22 10:45:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40 (23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152)\n23605033eb01fa4b179574978e3674ee11d616b5a2318dc3a64a45784a881152\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.272 182627 DEBUG nova.network.os_vif_util [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.273 182627 DEBUG nova.network.os_vif_util [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.273 182627 DEBUG os_vif [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.274 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.274 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebd087e9-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.273 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a2fddab6-b1a0-45c4-acf0-a90277eef4b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.274 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e79faf0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:50 np0005592767 kernel: tap7e79faf0-00: left promiscuous mode
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.280 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[83345208-ea71-40b8-b71c-dc456512b4c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.283 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.293 182627 INFO os_vif [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7b:69,bridge_name='br-int',has_traffic_filtering=True,id=ebd087e9-f858-41e2-a292-74b1525897a7,network=Network(e40f5d78-fe4a-4e89-9d9a-b5048293f4d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebd087e9-f8')#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.294 182627 INFO nova.virt.libvirt.driver [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Deleting instance files /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83_del#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.295 182627 INFO nova.virt.libvirt.driver [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Deletion of /var/lib/nova/instances/bc03363d-d5c8-41bf-b821-89de00c02b83_del complete#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.300 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.302 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[73b0a39a-d124-49d8-bee6-0918145d77fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.303 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1125cbfc-d415-49d6-a89b-0a55c032a04f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.320 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c9e9c8-5d16-4ac8-a495-38321a8cbfd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551859, 'reachable_time': 24875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235255, 'error': None, 'target': 'ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.322 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e79faf0-0de6-49d0-a2b3-7134e2a56b40 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.322 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[72acffdf-ef19-4a38-a397-27510ed54f9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.324 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ebd087e9-f858-41e2-a292-74b1525897a7 in datapath e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 unbound from our chassis#033[00m
Jan 22 17:45:50 np0005592767 systemd[1]: run-netns-ovnmeta\x2d7e79faf0\x2d0de6\x2d49d0\x2da2b3\x2d7134e2a56b40.mount: Deactivated successfully.
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.325 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.326 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[925504e8-ea77-4cde-8cd2-4fa187de8d64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.327 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 namespace which is not needed anymore#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.382 182627 INFO nova.compute.manager [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.383 182627 DEBUG oslo.service.loopingcall [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.384 182627 DEBUG nova.compute.manager [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.385 182627 DEBUG nova.network.neutron [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [NOTICE]   (234872) : haproxy version is 2.8.14-c23fe91
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [NOTICE]   (234872) : path to executable is /usr/sbin/haproxy
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [WARNING]  (234872) : Exiting Master process...
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [WARNING]  (234872) : Exiting Master process...
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [ALERT]    (234872) : Current worker (234880) exited with code 143 (Terminated)
Jan 22 17:45:50 np0005592767 neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5[234852]: [WARNING]  (234872) : All workers exited. Exiting... (0)
Jan 22 17:45:50 np0005592767 systemd[1]: libpod-1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb.scope: Deactivated successfully.
Jan 22 17:45:50 np0005592767 podman[235273]: 2026-01-22 22:45:50.506596455 +0000 UTC m=+0.061277315 container died 1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:45:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb-userdata-shm.mount: Deactivated successfully.
Jan 22 17:45:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay-179af8cd5090e623f55ed05099b26d0f899312ad2e8531eb4ffbe84f5589d03c-merged.mount: Deactivated successfully.
Jan 22 17:45:50 np0005592767 podman[235273]: 2026-01-22 22:45:50.5520197 +0000 UTC m=+0.106700570 container cleanup 1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 17:45:50 np0005592767 systemd[1]: libpod-conmon-1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb.scope: Deactivated successfully.
Jan 22 17:45:50 np0005592767 podman[235302]: 2026-01-22 22:45:50.644848165 +0000 UTC m=+0.063511707 container remove 1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.650 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[05aa5334-3b94-4cd9-a98c-60e96844ba03]: (4, ('Thu Jan 22 10:45:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 (1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb)\n1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb\nThu Jan 22 10:45:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 (1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb)\n1a653dd65cfe5074439facfe26b6cc9a52481fd38814d6bb058d783dbb8eaceb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.652 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[50cbd73d-ef3b-45eb-854e-6b26cb4abf0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.653 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape40f5d78-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 kernel: tape40f5d78-f0: left promiscuous mode
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.659 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4dddee94-ba86-4a48-9319-36a02136d423]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 nova_compute[182623]: 2026-01-22 22:45:50.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.720 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[21fc3242-5f70-4821-966a-942510cafec8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.721 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c536cbc9-7786-4d60-aa25-1e716c03408e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.736 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c184013-ab56-48c3-8eda-3187f3116ded]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551946, 'reachable_time': 31277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235318, 'error': None, 'target': 'ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.738 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e40f5d78-fe4a-4e89-9d9a-b5048293f4d5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:45:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:50.738 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[09c5634f-37a9-481e-a01a-d5740d37cb4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 systemd[1]: run-netns-ovnmeta\x2de40f5d78\x2dfe4a\x2d4e89\x2d9d9a\x2db5048293f4d5.mount: Deactivated successfully.
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.252 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.253 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.253 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.253 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.253 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.266 182627 INFO nova.compute.manager [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Terminating instance#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.276 182627 DEBUG nova.compute.manager [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:45:51 np0005592767 kernel: tap916f60a4-b7 (unregistering): left promiscuous mode
Jan 22 17:45:51 np0005592767 NetworkManager[54973]: <info>  [1769121951.2977] device (tap916f60a4-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.301 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:51Z|00676|binding|INFO|Releasing lport 916f60a4-b752-4bdf-be95-24d8336ff086 from this chassis (sb_readonly=0)
Jan 22 17:45:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:51Z|00677|binding|INFO|Setting lport 916f60a4-b752-4bdf-be95-24d8336ff086 down in Southbound
Jan 22 17:45:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:45:51Z|00678|binding|INFO|Removing iface tap916f60a4-b7 ovn-installed in OVS
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.305 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.327 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:33:50 10.100.0.13'], port_security=['fa:16:3e:43:33:50 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e00f1455-458c-45c4-a64c-1b232e5e7f98', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b45fb009-f5f0-48ec-a656-1f230f81978d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '938a70d81adf4493b2089cf126338d06', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd7173193-8bce-4dc9-9c9c-a90d5d8c43a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=231f3f8a-39d6-4ac5-9b7e-cb21bf3112ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=916f60a4-b752-4bdf-be95-24d8336ff086) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.329 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 916f60a4-b752-4bdf-be95-24d8336ff086 in datapath b45fb009-f5f0-48ec-a656-1f230f81978d unbound from our chassis#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.332 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.334 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b45fb009-f5f0-48ec-a656-1f230f81978d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.336 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e68b44-c605-4fc7-add4-76a24d1a6ddd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.337 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d namespace which is not needed anymore#033[00m
Jan 22 17:45:51 np0005592767 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Jan 22 17:45:51 np0005592767 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d0000009a.scope: Consumed 3.429s CPU time.
Jan 22 17:45:51 np0005592767 systemd-machined[153912]: Machine qemu-82-instance-0000009a terminated.
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.487 182627 DEBUG nova.network.neutron [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updated VIF entry in instance network info cache for port 0ed65493-a756-4d1d-88c3-edf23728b4e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.488 182627 DEBUG nova.network.neutron [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [{"id": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "address": "fa:16:3e:42:51:c2", "network": {"id": "7e79faf0-0de6-49d0-a2b3-7134e2a56b40", "bridge": "br-int", "label": "tempest-network-smoke--2062718683", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed65493-a7", "ovs_interfaceid": "0ed65493-a756-4d1d-88c3-edf23728b4e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ebd087e9-f858-41e2-a292-74b1525897a7", "address": "fa:16:3e:02:7b:69", "network": {"id": "e40f5d78-fe4a-4e89-9d9a-b5048293f4d5", "bridge": "br-int", "label": "tempest-network-smoke--287522423", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe02:7b69", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebd087e9-f8", "ovs_interfaceid": "ebd087e9-f858-41e2-a292-74b1525897a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:51 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [NOTICE]   (235141) : haproxy version is 2.8.14-c23fe91
Jan 22 17:45:51 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [NOTICE]   (235141) : path to executable is /usr/sbin/haproxy
Jan 22 17:45:51 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [WARNING]  (235141) : Exiting Master process...
Jan 22 17:45:51 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [WARNING]  (235141) : Exiting Master process...
Jan 22 17:45:51 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [ALERT]    (235141) : Current worker (235143) exited with code 143 (Terminated)
Jan 22 17:45:51 np0005592767 neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d[235118]: [WARNING]  (235141) : All workers exited. Exiting... (0)
Jan 22 17:45:51 np0005592767 systemd[1]: libpod-1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6.scope: Deactivated successfully.
Jan 22 17:45:51 np0005592767 podman[235338]: 2026-01-22 22:45:51.500129387 +0000 UTC m=+0.057470677 container died 1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:45:51 np0005592767 NetworkManager[54973]: <info>  [1769121951.5015] manager: (tap916f60a4-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/311)
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.519 182627 DEBUG oslo_concurrency.lockutils [req-a99d3bb9-7d64-4fed-9c31-6918bdfc5f1a req-c4635250-6363-4eed-a0ea-14619432a57e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bc03363d-d5c8-41bf-b821-89de00c02b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:45:51 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6-userdata-shm.mount: Deactivated successfully.
Jan 22 17:45:51 np0005592767 systemd[1]: var-lib-containers-storage-overlay-e201861ebb506df93593080a9b6909860258e18b3933d710643af108d25c3f39-merged.mount: Deactivated successfully.
Jan 22 17:45:51 np0005592767 podman[235338]: 2026-01-22 22:45:51.53630335 +0000 UTC m=+0.093644690 container cleanup 1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:45:51 np0005592767 systemd[1]: libpod-conmon-1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6.scope: Deactivated successfully.
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.549 182627 INFO nova.virt.libvirt.driver [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Instance destroyed successfully.#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.550 182627 DEBUG nova.objects.instance [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lazy-loading 'resources' on Instance uuid e00f1455-458c-45c4-a64c-1b232e5e7f98 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.564 182627 DEBUG nova.virt.libvirt.vif [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1282826420',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1282826420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1282826420',id=154,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:45:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='938a70d81adf4493b2089cf126338d06',ramdisk_id='',reservation_id='r-28tqtwpe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vi
rtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1502681609',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1502681609-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:45:48Z,user_data=None,user_id='aeafaa260c0241a2889fa68557afaac4',uuid=e00f1455-458c-45c4-a64c-1b232e5e7f98,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.564 182627 DEBUG nova.network.os_vif_util [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Converting VIF {"id": "916f60a4-b752-4bdf-be95-24d8336ff086", "address": "fa:16:3e:43:33:50", "network": {"id": "b45fb009-f5f0-48ec-a656-1f230f81978d", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1739646034-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "938a70d81adf4493b2089cf126338d06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap916f60a4-b7", "ovs_interfaceid": "916f60a4-b752-4bdf-be95-24d8336ff086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.565 182627 DEBUG nova.network.os_vif_util [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.565 182627 DEBUG os_vif [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.567 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.567 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap916f60a4-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.570 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.571 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.573 182627 INFO os_vif [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:33:50,bridge_name='br-int',has_traffic_filtering=True,id=916f60a4-b752-4bdf-be95-24d8336ff086,network=Network(b45fb009-f5f0-48ec-a656-1f230f81978d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap916f60a4-b7')#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.574 182627 INFO nova.virt.libvirt.driver [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Deleting instance files /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98_del#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.574 182627 INFO nova.virt.libvirt.driver [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Deletion of /var/lib/nova/instances/e00f1455-458c-45c4-a64c-1b232e5e7f98_del complete#033[00m
Jan 22 17:45:51 np0005592767 podman[235385]: 2026-01-22 22:45:51.599061715 +0000 UTC m=+0.039959221 container remove 1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.604 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0cdb712b-28fb-41f0-beca-5f26b747858f]: (4, ('Thu Jan 22 10:45:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d (1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6)\n1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6\nThu Jan 22 10:45:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d (1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6)\n1f2ba447936ec65bd2cbd0c7bf750875f3baeb71f5deaf58f6ad714de58c9bc6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.605 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bd92c891-f3e6-4812-a6e5-d0aacce33d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.606 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb45fb009-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.607 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 kernel: tapb45fb009-f0: left promiscuous mode
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.630 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.633 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dc23d82d-d356-4c88-a1f5-3018902793e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.648 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[24f831ff-f47e-4a1c-89da-65cf44415871]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.650 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[077809ce-3967-42c3-b96e-cc9fdf433931]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.665 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a7f65c-a5c0-45c2-a142-127249801a81]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554395, 'reachable_time': 22429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235401, 'error': None, 'target': 'ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.667 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b45fb009-f5f0-48ec-a656-1f230f81978d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:45:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:51.667 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[08f6d6c1-5e1c-44f0-9a0e-470ca75ead73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:45:51 np0005592767 systemd[1]: run-netns-ovnmeta\x2db45fb009\x2df5f0\x2d48ec\x2da656\x2d1f230f81978d.mount: Deactivated successfully.
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.673 182627 INFO nova.compute.manager [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.674 182627 DEBUG oslo.service.loopingcall [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.674 182627 DEBUG nova.compute.manager [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:45:51 np0005592767 nova_compute[182623]: 2026-01-22 22:45:51.674 182627 DEBUG nova.network.neutron [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.036 182627 DEBUG nova.network.neutron [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.056 182627 INFO nova.compute.manager [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Took 1.67 seconds to deallocate network for instance.#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.143 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.143 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.194 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-unplugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.195 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.195 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.195 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.196 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] No waiting events found dispatching network-vif-unplugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.196 182627 WARNING nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received unexpected event network-vif-unplugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.196 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.196 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.197 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.197 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.197 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] No waiting events found dispatching network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.197 182627 WARNING nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received unexpected event network-vif-plugged-0ed65493-a756-4d1d-88c3-edf23728b4e9 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.197 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-deleted-0ed65493-a756-4d1d-88c3-edf23728b4e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.198 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-vif-unplugged-916f60a4-b752-4bdf-be95-24d8336ff086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.198 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.198 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.198 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.199 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] No waiting events found dispatching network-vif-unplugged-916f60a4-b752-4bdf-be95-24d8336ff086 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.199 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-vif-unplugged-916f60a4-b752-4bdf-be95-24d8336ff086 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.199 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.200 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.200 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.200 182627 DEBUG oslo_concurrency.lockutils [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.200 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] No waiting events found dispatching network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.200 182627 WARNING nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received unexpected event network-vif-plugged-916f60a4-b752-4bdf-be95-24d8336ff086 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.201 182627 DEBUG nova.compute.manager [req-ce482412-e26c-4741-99ce-621f5f3c3c11 req-9f07e448-5b41-4ba6-bca1-cee0b698e2e9 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-deleted-ebd087e9-f858-41e2-a292-74b1525897a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.221 182627 DEBUG nova.compute.provider_tree [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.239 182627 DEBUG nova.scheduler.client.report [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.258 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.267 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.286 182627 INFO nova.scheduler.client.report [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance bc03363d-d5c8-41bf-b821-89de00c02b83#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.343 182627 DEBUG nova.compute.manager [req-1ca34c63-b6f9-490e-8b69-a93916b72d02 req-7c1612e9-5dd4-4e23-bcfe-95bc2bc12fea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.344 182627 DEBUG oslo_concurrency.lockutils [req-1ca34c63-b6f9-490e-8b69-a93916b72d02 req-7c1612e9-5dd4-4e23-bcfe-95bc2bc12fea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.344 182627 DEBUG oslo_concurrency.lockutils [req-1ca34c63-b6f9-490e-8b69-a93916b72d02 req-7c1612e9-5dd4-4e23-bcfe-95bc2bc12fea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.344 182627 DEBUG oslo_concurrency.lockutils [req-1ca34c63-b6f9-490e-8b69-a93916b72d02 req-7c1612e9-5dd4-4e23-bcfe-95bc2bc12fea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.344 182627 DEBUG nova.compute.manager [req-1ca34c63-b6f9-490e-8b69-a93916b72d02 req-7c1612e9-5dd4-4e23-bcfe-95bc2bc12fea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] No waiting events found dispatching network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.344 182627 WARNING nova.compute.manager [req-1ca34c63-b6f9-490e-8b69-a93916b72d02 req-7c1612e9-5dd4-4e23-bcfe-95bc2bc12fea 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Received unexpected event network-vif-plugged-ebd087e9-f858-41e2-a292-74b1525897a7 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.364 182627 DEBUG oslo_concurrency.lockutils [None req-e0b1fca3-e3ce-41fc-8dd5-ec149bcb7bcd 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bc03363d-d5c8-41bf-b821-89de00c02b83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.533 182627 DEBUG nova.network.neutron [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.549 182627 INFO nova.compute.manager [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Took 0.87 seconds to deallocate network for instance.#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.613 182627 DEBUG nova.compute.manager [req-128327cf-86b8-46f2-b544-220cec1dafd4 req-598533ca-f71d-4a7e-8f26-b716c1c713fd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Received event network-vif-deleted-916f60a4-b752-4bdf-be95-24d8336ff086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.614 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.615 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.660 182627 DEBUG nova.compute.provider_tree [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.672 182627 DEBUG nova.scheduler.client.report [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.686 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.708 182627 INFO nova.scheduler.client.report [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Deleted allocations for instance e00f1455-458c-45c4-a64c-1b232e5e7f98#033[00m
Jan 22 17:45:52 np0005592767 nova_compute[182623]: 2026-01-22 22:45:52.815 182627 DEBUG oslo_concurrency.lockutils [None req-59bf4d31-5fc3-4f2e-ad5d-5ce0e61584b0 aeafaa260c0241a2889fa68557afaac4 938a70d81adf4493b2089cf126338d06 - - default default] Lock "e00f1455-458c-45c4-a64c-1b232e5e7f98" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:45:54 np0005592767 podman[235402]: 2026-01-22 22:45:54.154233062 +0000 UTC m=+0.066764349 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:45:56 np0005592767 nova_compute[182623]: 2026-01-22 22:45:56.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:56 np0005592767 nova_compute[182623]: 2026-01-22 22:45:56.352 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:45:56.396 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:45:56 np0005592767 nova_compute[182623]: 2026-01-22 22:45:56.569 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:45:57 np0005592767 nova_compute[182623]: 2026-01-22 22:45:57.269 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:01 np0005592767 nova_compute[182623]: 2026-01-22 22:46:01.572 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:02 np0005592767 nova_compute[182623]: 2026-01-22 22:46:02.299 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:05 np0005592767 nova_compute[182623]: 2026-01-22 22:46:05.243 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121950.2425644, bc03363d-d5c8-41bf-b821-89de00c02b83 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:46:05 np0005592767 nova_compute[182623]: 2026-01-22 22:46:05.244 182627 INFO nova.compute.manager [-] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:46:05 np0005592767 nova_compute[182623]: 2026-01-22 22:46:05.264 182627 DEBUG nova.compute.manager [None req-ff588ca2-8108-4263-a480-6cac630338bc - - - - - -] [instance: bc03363d-d5c8-41bf-b821-89de00c02b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:06 np0005592767 podman[235428]: 2026-01-22 22:46:06.149269292 +0000 UTC m=+0.067404167 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 22 17:46:06 np0005592767 nova_compute[182623]: 2026-01-22 22:46:06.548 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769121951.547151, e00f1455-458c-45c4-a64c-1b232e5e7f98 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:46:06 np0005592767 nova_compute[182623]: 2026-01-22 22:46:06.548 182627 INFO nova.compute.manager [-] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:46:06 np0005592767 nova_compute[182623]: 2026-01-22 22:46:06.564 182627 DEBUG nova.compute.manager [None req-e690c30f-236b-4288-9025-b348cbf5e657 - - - - - -] [instance: e00f1455-458c-45c4-a64c-1b232e5e7f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:06 np0005592767 nova_compute[182623]: 2026-01-22 22:46:06.576 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:07 np0005592767 nova_compute[182623]: 2026-01-22 22:46:07.300 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:46:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:46:11 np0005592767 podman[235450]: 2026-01-22 22:46:11.140812632 +0000 UTC m=+0.053709640 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 22 17:46:11 np0005592767 podman[235449]: 2026-01-22 22:46:11.173302841 +0000 UTC m=+0.092336472 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:46:11 np0005592767 nova_compute[182623]: 2026-01-22 22:46:11.701 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:12.117 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:12.118 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:12.118 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:12 np0005592767 nova_compute[182623]: 2026-01-22 22:46:12.304 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:16 np0005592767 nova_compute[182623]: 2026-01-22 22:46:16.704 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:17 np0005592767 nova_compute[182623]: 2026-01-22 22:46:17.306 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:19 np0005592767 podman[235497]: 2026-01-22 22:46:19.166720193 +0000 UTC m=+0.086372514 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:46:19 np0005592767 podman[235498]: 2026-01-22 22:46:19.173516865 +0000 UTC m=+0.081689732 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:46:21 np0005592767 nova_compute[182623]: 2026-01-22 22:46:21.707 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:22 np0005592767 nova_compute[182623]: 2026-01-22 22:46:22.309 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:25 np0005592767 podman[235541]: 2026-01-22 22:46:25.167157171 +0000 UTC m=+0.087601459 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:46:26 np0005592767 nova_compute[182623]: 2026-01-22 22:46:26.711 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:27 np0005592767 nova_compute[182623]: 2026-01-22 22:46:27.354 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:28 np0005592767 nova_compute[182623]: 2026-01-22 22:46:28.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:28 np0005592767 nova_compute[182623]: 2026-01-22 22:46:28.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 17:46:28 np0005592767 nova_compute[182623]: 2026-01-22 22:46:28.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.332 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.332 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.332 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.333 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.714 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.927 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.927 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:31 np0005592767 nova_compute[182623]: 2026-01-22 22:46:31.927 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.102 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.104 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5688MB free_disk=73.12092208862305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.104 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.104 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.179 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.179 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.223 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.238 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.238 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.261 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.288 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.328 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.348 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.388 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.389 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:32 np0005592767 nova_compute[182623]: 2026-01-22 22:46:32.390 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.537 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.538 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.554 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.661 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.661 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.667 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.668 182627 INFO nova.compute.claims [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.891 182627 DEBUG nova.compute.provider_tree [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.913 182627 DEBUG nova.scheduler.client.report [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.945 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:33 np0005592767 nova_compute[182623]: 2026-01-22 22:46:33.947 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.008 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.008 182627 DEBUG nova.network.neutron [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.039 182627 INFO nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.065 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.216 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.218 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.218 182627 INFO nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Creating image(s)
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.219 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "/var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.219 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "/var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.220 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "/var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.231 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.304 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.306 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.306 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.316 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.370 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.371 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:46:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:34.385 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:57:9d 2001:db8:0:1:f816:3eff:fe60:579d 2001:db8::f816:3eff:fe60:579d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe60:579d/64 2001:db8::f816:3eff:fe60:579d/64', 'neutron:device_id': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f24457d1-1f42-46ad-bdaa-d087103c906a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f20a608b-4dde-4090-8331-5a96db0eeb25) old=Port_Binding(mac=['fa:16:3e:60:57:9d 2001:db8::f816:3eff:fe60:579d'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe60:579d/64', 'neutron:device_id': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:46:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:34.386 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f20a608b-4dde-4090-8331-5a96db0eeb25 in datapath 09b515c7-d044-43d4-b895-408eb5de1fd8 updated#033[00m
Jan 22 17:46:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:34.387 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09b515c7-d044-43d4-b895-408eb5de1fd8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.389 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:46:34 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:34.389 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e48ffa0c-3779-4676-bd59-b63e1672c457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.405 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.405 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.406 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.424 182627 DEBUG nova.policy [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12468cd86a594cc5ba37213d454f45c8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63313e52f3864087904d9eb367b6597c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.461 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.462 182627 DEBUG nova.virt.disk.api [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Checking if we can resize image /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.463 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.520 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.521 182627 DEBUG nova.virt.disk.api [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Cannot resize image /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.522 182627 DEBUG nova.objects.instance [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'migration_context' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.537 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.538 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Ensure instance console log exists: /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.538 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.539 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:34 np0005592767 nova_compute[182623]: 2026-01-22 22:46:34.539 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:35 np0005592767 nova_compute[182623]: 2026-01-22 22:46:35.591 182627 DEBUG nova.network.neutron [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Successfully created port: 0e0ff891-b3da-4f36-87e5-d9e183dd922a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:46:36 np0005592767 nova_compute[182623]: 2026-01-22 22:46:36.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:36 np0005592767 nova_compute[182623]: 2026-01-22 22:46:36.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:46:36 np0005592767 nova_compute[182623]: 2026-01-22 22:46:36.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:46:36 np0005592767 nova_compute[182623]: 2026-01-22 22:46:36.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.076 182627 DEBUG nova.network.neutron [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Successfully updated port: 0e0ff891-b3da-4f36-87e5-d9e183dd922a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.100 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.101 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquired lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.102 182627 DEBUG nova.network.neutron [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:46:37 np0005592767 podman[235582]: 2026-01-22 22:46:37.186298075 +0000 UTC m=+0.099171876 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.257 182627 DEBUG nova.compute.manager [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-changed-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.258 182627 DEBUG nova.compute.manager [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Refreshing instance network info cache due to event network-changed-0e0ff891-b3da-4f36-87e5-d9e183dd922a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.259 182627 DEBUG oslo_concurrency.lockutils [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.304 182627 DEBUG nova.network.neutron [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:46:37 np0005592767 nova_compute[182623]: 2026-01-22 22:46:37.434 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.844 182627 DEBUG nova.network.neutron [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Updating instance_info_cache with network_info: [{"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.878 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Releasing lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.878 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance network_info: |[{"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.878 182627 DEBUG oslo_concurrency.lockutils [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.879 182627 DEBUG nova.network.neutron [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Refreshing network info cache for port 0e0ff891-b3da-4f36-87e5-d9e183dd922a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.881 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Start _get_guest_xml network_info=[{"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.887 182627 WARNING nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.894 182627 DEBUG nova.virt.libvirt.host [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.895 182627 DEBUG nova.virt.libvirt.host [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.900 182627 DEBUG nova.virt.libvirt.host [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.901 182627 DEBUG nova.virt.libvirt.host [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.902 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.902 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.903 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.903 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.903 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.903 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.904 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.904 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.904 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.904 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.905 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.905 182627 DEBUG nova.virt.hardware [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.908 182627 DEBUG nova.virt.libvirt.vif [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:46:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-221184610',display_name='tempest-TestServerAdvancedOps-server-221184610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-221184610',id=156,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63313e52f3864087904d9eb367b6597c',ramdisk_id='',reservation_id='r-zz24bxb1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-974440594',owner_user_name='tempest-TestServerAdvancedOps-974440594-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:46:34Z,user_data=None,user_id='12468cd86a594cc5ba37213d454f45c8',uuid=17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.909 182627 DEBUG nova.network.os_vif_util [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converting VIF {"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.909 182627 DEBUG nova.network.os_vif_util [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.910 182627 DEBUG nova.objects.instance [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'pci_devices' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.945 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <uuid>17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6</uuid>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <name>instance-0000009c</name>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestServerAdvancedOps-server-221184610</nova:name>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:46:38</nova:creationTime>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:user uuid="12468cd86a594cc5ba37213d454f45c8">tempest-TestServerAdvancedOps-974440594-project-member</nova:user>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:project uuid="63313e52f3864087904d9eb367b6597c">tempest-TestServerAdvancedOps-974440594</nova:project>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        <nova:port uuid="0e0ff891-b3da-4f36-87e5-d9e183dd922a">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <entry name="serial">17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6</entry>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <entry name="uuid">17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6</entry>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.config"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:bb:d4:40"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <target dev="tap0e0ff891-b3"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/console.log" append="off"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:46:38 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:46:38 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:46:38 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:46:38 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.947 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Preparing to wait for external event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.948 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.948 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.948 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.949 182627 DEBUG nova.virt.libvirt.vif [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:46:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-221184610',display_name='tempest-TestServerAdvancedOps-server-221184610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-221184610',id=156,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63313e52f3864087904d9eb367b6597c',ramdisk_id='',reservation_id='r-zz24bxb1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-974440594',owner_user_name='tempest-TestServerAdvancedOps-974440594-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:46:34Z,user_data=None,user_id='12468cd86a594cc5ba37213d454f45c8',uuid=17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.949 182627 DEBUG nova.network.os_vif_util [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converting VIF {"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.950 182627 DEBUG nova.network.os_vif_util [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.950 182627 DEBUG os_vif [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.951 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.951 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.952 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.954 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.954 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e0ff891-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.955 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e0ff891-b3, col_values=(('external_ids', {'iface-id': '0e0ff891-b3da-4f36-87e5-d9e183dd922a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:d4:40', 'vm-uuid': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:38 np0005592767 NetworkManager[54973]: <info>  [1769121998.9584] manager: (tap0e0ff891-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.960 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:38 np0005592767 nova_compute[182623]: 2026-01-22 22:46:38.970 182627 INFO os_vif [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3')#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.031 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.031 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.032 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] No VIF found with MAC fa:16:3e:bb:d4:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.032 182627 INFO nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Using config drive#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.516 182627 INFO nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Creating config drive at /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.config#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.521 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa0s4uwi8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.645 182627 DEBUG oslo_concurrency.processutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa0s4uwi8" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:46:39 np0005592767 kernel: tap0e0ff891-b3: entered promiscuous mode
Jan 22 17:46:39 np0005592767 NetworkManager[54973]: <info>  [1769121999.7432] manager: (tap0e0ff891-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/313)
Jan 22 17:46:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:39Z|00679|binding|INFO|Claiming lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a for this chassis.
Jan 22 17:46:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:39Z|00680|binding|INFO|0e0ff891-b3da-4f36-87e5-d9e183dd922a: Claiming fa:16:3e:bb:d4:40 10.100.0.11
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.742 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:39.756 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d4:40 10.100.0.11'], port_security=['fa:16:3e:bb:d4:40 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36fe4bac-181e-4bfc-86f7-eae4a762b7a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63313e52f3864087904d9eb367b6597c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7f64a1a8-1f1f-42f1-8371-6c9e7f4e6248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9bac3e2c-a40d-4e70-a8f4-b5b18df06d99, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0e0ff891-b3da-4f36-87e5-d9e183dd922a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:46:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:39.760 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0ff891-b3da-4f36-87e5-d9e183dd922a in datapath 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 bound to our chassis#033[00m
Jan 22 17:46:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:39.762 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:46:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:39.763 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9816265e-168e-4b9f-8f65-ed61baa1c4bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.782 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.789 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:39Z|00681|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a ovn-installed in OVS
Jan 22 17:46:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:39Z|00682|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a up in Southbound
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.792 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:39 np0005592767 systemd-udevd[235622]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:46:39 np0005592767 systemd-machined[153912]: New machine qemu-83-instance-0000009c.
Jan 22 17:46:39 np0005592767 NetworkManager[54973]: <info>  [1769121999.8113] device (tap0e0ff891-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:46:39 np0005592767 NetworkManager[54973]: <info>  [1769121999.8124] device (tap0e0ff891-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:46:39 np0005592767 systemd[1]: Started Virtual Machine qemu-83-instance-0000009c.
Jan 22 17:46:39 np0005592767 nova_compute[182623]: 2026-01-22 22:46:39.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.428 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122000.4276342, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.428 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Started (Lifecycle Event)#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.446 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.451 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122000.4317307, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.451 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.466 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.471 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:46:40 np0005592767 nova_compute[182623]: 2026-01-22 22:46:40.494 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.285 182627 DEBUG nova.compute.manager [req-5051e459-a9a9-486a-851e-e7f4778464ff req-8a6ce8ba-5d6e-440e-8eaa-dad20c778e8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.285 182627 DEBUG oslo_concurrency.lockutils [req-5051e459-a9a9-486a-851e-e7f4778464ff req-8a6ce8ba-5d6e-440e-8eaa-dad20c778e8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.285 182627 DEBUG oslo_concurrency.lockutils [req-5051e459-a9a9-486a-851e-e7f4778464ff req-8a6ce8ba-5d6e-440e-8eaa-dad20c778e8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.286 182627 DEBUG oslo_concurrency.lockutils [req-5051e459-a9a9-486a-851e-e7f4778464ff req-8a6ce8ba-5d6e-440e-8eaa-dad20c778e8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.286 182627 DEBUG nova.compute.manager [req-5051e459-a9a9-486a-851e-e7f4778464ff req-8a6ce8ba-5d6e-440e-8eaa-dad20c778e8c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Processing event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.287 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.291 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122001.2910867, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.291 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.294 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.299 182627 INFO nova.virt.libvirt.driver [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance spawned successfully.#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.299 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.324 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.331 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.335 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.335 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.336 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.336 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.336 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.337 182627 DEBUG nova.virt.libvirt.driver [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.361 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.414 182627 INFO nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Took 7.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.415 182627 DEBUG nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.499 182627 INFO nova.compute.manager [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Took 7.86 seconds to build instance.#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.530 182627 DEBUG oslo_concurrency.lockutils [None req-f8f86d3c-f96e-4563-97c1-c99afc1ab130 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.901 182627 DEBUG nova.network.neutron [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Updated VIF entry in instance network info cache for port 0e0ff891-b3da-4f36-87e5-d9e183dd922a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.902 182627 DEBUG nova.network.neutron [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Updating instance_info_cache with network_info: [{"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:46:41 np0005592767 nova_compute[182623]: 2026-01-22 22:46:41.928 182627 DEBUG oslo_concurrency.lockutils [req-e92d9eac-84af-4d36-84c1-94814212fbce req-f40949ad-4c1a-4bc7-b16a-d82cd1ef625d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:46:42 np0005592767 podman[235639]: 2026-01-22 22:46:42.195220767 +0000 UTC m=+0.096434239 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:46:42 np0005592767 podman[235638]: 2026-01-22 22:46:42.261033569 +0000 UTC m=+0.166162812 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:46:42 np0005592767 nova_compute[182623]: 2026-01-22 22:46:42.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.432 182627 DEBUG nova.compute.manager [req-d10aed69-504a-46ed-8bc7-eba7e146d428 req-f1d47b03-3d8e-40a0-a6e8-caef9461a51d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.433 182627 DEBUG oslo_concurrency.lockutils [req-d10aed69-504a-46ed-8bc7-eba7e146d428 req-f1d47b03-3d8e-40a0-a6e8-caef9461a51d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.433 182627 DEBUG oslo_concurrency.lockutils [req-d10aed69-504a-46ed-8bc7-eba7e146d428 req-f1d47b03-3d8e-40a0-a6e8-caef9461a51d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.434 182627 DEBUG oslo_concurrency.lockutils [req-d10aed69-504a-46ed-8bc7-eba7e146d428 req-f1d47b03-3d8e-40a0-a6e8-caef9461a51d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.434 182627 DEBUG nova.compute.manager [req-d10aed69-504a-46ed-8bc7-eba7e146d428 req-f1d47b03-3d8e-40a0-a6e8-caef9461a51d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.434 182627 WARNING nova.compute.manager [req-d10aed69-504a-46ed-8bc7-eba7e146d428 req-f1d47b03-3d8e-40a0-a6e8-caef9461a51d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state active and task_state None.#033[00m
Jan 22 17:46:43 np0005592767 nova_compute[182623]: 2026-01-22 22:46:43.959 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:44 np0005592767 nova_compute[182623]: 2026-01-22 22:46:44.395 182627 DEBUG nova.objects.instance [None req-f0597f76-87be-47f2-a3c4-747635e7e40f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'pci_devices' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:46:44 np0005592767 nova_compute[182623]: 2026-01-22 22:46:44.420 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122004.4197834, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:46:44 np0005592767 nova_compute[182623]: 2026-01-22 22:46:44.420 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:46:44 np0005592767 nova_compute[182623]: 2026-01-22 22:46:44.443 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:44 np0005592767 nova_compute[182623]: 2026-01-22 22:46:44.449 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:46:44 np0005592767 nova_compute[182623]: 2026-01-22 22:46:44.478 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 22 17:46:44 np0005592767 kernel: tap0e0ff891-b3 (unregistering): left promiscuous mode
Jan 22 17:46:44 np0005592767 NetworkManager[54973]: <info>  [1769122004.9992] device (tap0e0ff891-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:46:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:45Z|00683|binding|INFO|Releasing lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a from this chassis (sb_readonly=0)
Jan 22 17:46:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:45Z|00684|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a down in Southbound
Jan 22 17:46:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:45Z|00685|binding|INFO|Removing iface tap0e0ff891-b3 ovn-installed in OVS
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.007 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:45.021 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d4:40 10.100.0.11'], port_security=['fa:16:3e:bb:d4:40 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36fe4bac-181e-4bfc-86f7-eae4a762b7a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63313e52f3864087904d9eb367b6597c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f64a1a8-1f1f-42f1-8371-6c9e7f4e6248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9bac3e2c-a40d-4e70-a8f4-b5b18df06d99, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0e0ff891-b3da-4f36-87e5-d9e183dd922a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:46:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:45.023 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0ff891-b3da-4f36-87e5-d9e183dd922a in datapath 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 unbound from our chassis#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:45.025 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:46:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:45.027 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5d2eb5-4c90-48c0-a9aa-0820a52c4b23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:46:45 np0005592767 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 22 17:46:45 np0005592767 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000009c.scope: Consumed 3.905s CPU time.
Jan 22 17:46:45 np0005592767 systemd-machined[153912]: Machine qemu-83-instance-0000009c terminated.
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.196 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.236 182627 DEBUG nova.compute.manager [None req-f0597f76-87be-47f2-a3c4-747635e7e40f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.545 182627 DEBUG nova.compute.manager [req-dba67af3-f617-4faa-85db-9fd8a22ad08e req-b28a9c2b-5887-4a7c-b22c-e9dadc145392 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.546 182627 DEBUG oslo_concurrency.lockutils [req-dba67af3-f617-4faa-85db-9fd8a22ad08e req-b28a9c2b-5887-4a7c-b22c-e9dadc145392 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.546 182627 DEBUG oslo_concurrency.lockutils [req-dba67af3-f617-4faa-85db-9fd8a22ad08e req-b28a9c2b-5887-4a7c-b22c-e9dadc145392 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.546 182627 DEBUG oslo_concurrency.lockutils [req-dba67af3-f617-4faa-85db-9fd8a22ad08e req-b28a9c2b-5887-4a7c-b22c-e9dadc145392 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.546 182627 DEBUG nova.compute.manager [req-dba67af3-f617-4faa-85db-9fd8a22ad08e req-b28a9c2b-5887-4a7c-b22c-e9dadc145392 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:46:45 np0005592767 nova_compute[182623]: 2026-01-22 22:46:45.547 182627 WARNING nova.compute.manager [req-dba67af3-f617-4faa-85db-9fd8a22ad08e req-b28a9c2b-5887-4a7c-b22c-e9dadc145392 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state suspended and task_state None.#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.690 182627 DEBUG nova.compute.manager [req-c7aafec3-f4a9-48b7-bc57-998bb0b43954 req-c7833f3b-2d15-436a-8fae-3e6c85cd4b16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.691 182627 DEBUG oslo_concurrency.lockutils [req-c7aafec3-f4a9-48b7-bc57-998bb0b43954 req-c7833f3b-2d15-436a-8fae-3e6c85cd4b16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.691 182627 DEBUG oslo_concurrency.lockutils [req-c7aafec3-f4a9-48b7-bc57-998bb0b43954 req-c7833f3b-2d15-436a-8fae-3e6c85cd4b16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.691 182627 DEBUG oslo_concurrency.lockutils [req-c7aafec3-f4a9-48b7-bc57-998bb0b43954 req-c7833f3b-2d15-436a-8fae-3e6c85cd4b16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.691 182627 DEBUG nova.compute.manager [req-c7aafec3-f4a9-48b7-bc57-998bb0b43954 req-c7833f3b-2d15-436a-8fae-3e6c85cd4b16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.692 182627 WARNING nova.compute.manager [req-c7aafec3-f4a9-48b7-bc57-998bb0b43954 req-c7833f3b-2d15-436a-8fae-3e6c85cd4b16 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state suspended and task_state None.#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.924 182627 INFO nova.compute.manager [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Resuming#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.925 182627 DEBUG nova.objects.instance [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'flavor' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.980 182627 DEBUG oslo_concurrency.lockutils [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.980 182627 DEBUG oslo_concurrency.lockutils [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquired lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:46:47 np0005592767 nova_compute[182623]: 2026-01-22 22:46:47.981 182627 DEBUG nova.network.neutron [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:46:48 np0005592767 nova_compute[182623]: 2026-01-22 22:46:48.962 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:46:50 np0005592767 podman[235709]: 2026-01-22 22:46:50.166473051 +0000 UTC m=+0.081143196 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:46:50 np0005592767 podman[235710]: 2026-01-22 22:46:50.185546421 +0000 UTC m=+0.091508619 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.188 182627 DEBUG nova.network.neutron [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Updating instance_info_cache with network_info: [{"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.204 182627 DEBUG oslo_concurrency.lockutils [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Releasing lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.211 182627 DEBUG nova.virt.libvirt.vif [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:46:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-221184610',display_name='tempest-TestServerAdvancedOps-server-221184610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-221184610',id=156,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:46:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='63313e52f3864087904d9eb367b6597c',ramdisk_id='',reservation_id='r-zz24bxb1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-974440594',owner_user_name='tempest-TestServerAdvancedOps-974440594-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:46:45Z,user_data=None,user_id='12468cd86a594cc5ba37213d454f45c8',uuid=17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.212 182627 DEBUG nova.network.os_vif_util [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converting VIF {"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.213 182627 DEBUG nova.network.os_vif_util [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.214 182627 DEBUG os_vif [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.215 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.215 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.216 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.219 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.220 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e0ff891-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.220 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e0ff891-b3, col_values=(('external_ids', {'iface-id': '0e0ff891-b3da-4f36-87e5-d9e183dd922a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:d4:40', 'vm-uuid': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.221 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.222 182627 INFO os_vif [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3')
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.253 182627 DEBUG nova.objects.instance [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'numa_topology' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:50 np0005592767 kernel: tap0e0ff891-b3: entered promiscuous mode
Jan 22 17:46:50 np0005592767 NetworkManager[54973]: <info>  [1769122010.3467] manager: (tap0e0ff891-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Jan 22 17:46:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:50Z|00686|binding|INFO|Claiming lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a for this chassis.
Jan 22 17:46:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:50Z|00687|binding|INFO|0e0ff891-b3da-4f36-87e5-d9e183dd922a: Claiming fa:16:3e:bb:d4:40 10.100.0.11
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.347 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:50.352 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d4:40 10.100.0.11'], port_security=['fa:16:3e:bb:d4:40 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36fe4bac-181e-4bfc-86f7-eae4a762b7a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63313e52f3864087904d9eb367b6597c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7f64a1a8-1f1f-42f1-8371-6c9e7f4e6248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9bac3e2c-a40d-4e70-a8f4-b5b18df06d99, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0e0ff891-b3da-4f36-87e5-d9e183dd922a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:46:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:50.354 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0ff891-b3da-4f36-87e5-d9e183dd922a in datapath 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 bound to our chassis
Jan 22 17:46:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:50.355 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:46:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:50.356 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6da69d27-a786-42f5-b80f-0af144847c0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.359 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:50Z|00688|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a ovn-installed in OVS
Jan 22 17:46:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:50Z|00689|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a up in Southbound
Jan 22 17:46:50 np0005592767 nova_compute[182623]: 2026-01-22 22:46:50.362 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:50 np0005592767 systemd-udevd[235766]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:46:50 np0005592767 systemd-machined[153912]: New machine qemu-84-instance-0000009c.
Jan 22 17:46:50 np0005592767 NetworkManager[54973]: <info>  [1769122010.3940] device (tap0e0ff891-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:46:50 np0005592767 NetworkManager[54973]: <info>  [1769122010.3946] device (tap0e0ff891-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:46:50 np0005592767 systemd[1]: Started Virtual Machine qemu-84-instance-0000009c.
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.297 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.298 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122011.2971437, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.298 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Started (Lifecycle Event)
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.319 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.320 182627 DEBUG nova.compute.manager [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.320 182627 DEBUG nova.objects.instance [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'pci_devices' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.323 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.344 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.345 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122011.3045833, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.346 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Resumed (Lifecycle Event)
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.353 182627 INFO nova.virt.libvirt.driver [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance running successfully.
Jan 22 17:46:51 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.357 182627 DEBUG nova.virt.libvirt.guest [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.357 182627 DEBUG nova.compute.manager [None req-bde46fab-927f-4c4b-9c64-1ac1d639c30f 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.368 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.371 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.404 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.967 182627 DEBUG nova.compute.manager [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.968 182627 DEBUG oslo_concurrency.lockutils [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.968 182627 DEBUG oslo_concurrency.lockutils [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.968 182627 DEBUG oslo_concurrency.lockutils [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.968 182627 DEBUG nova.compute.manager [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.969 182627 WARNING nova.compute.manager [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state active and task_state None.
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.969 182627 DEBUG nova.compute.manager [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.969 182627 DEBUG oslo_concurrency.lockutils [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.969 182627 DEBUG oslo_concurrency.lockutils [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.969 182627 DEBUG oslo_concurrency.lockutils [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.970 182627 DEBUG nova.compute.manager [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:51 np0005592767 nova_compute[182623]: 2026-01-22 22:46:51.970 182627 WARNING nova.compute.manager [req-7d486d2c-2b6b-4f1f-b9fd-200b8fb064f2 req-e81b4ab9-d135-4822-8366-a43d5735d572 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state active and task_state None.
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.294 182627 DEBUG nova.objects.instance [None req-ec2b007c-76b9-42ca-9acb-479ac93560c2 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'pci_devices' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.313 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122012.312975, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.313 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Paused (Lifecycle Event)
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.333 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.336 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.360 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.440 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:52 np0005592767 kernel: tap0e0ff891-b3 (unregistering): left promiscuous mode
Jan 22 17:46:52 np0005592767 NetworkManager[54973]: <info>  [1769122012.8394] device (tap0e0ff891-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.887 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:52Z|00690|binding|INFO|Releasing lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a from this chassis (sb_readonly=0)
Jan 22 17:46:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:52Z|00691|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a down in Southbound
Jan 22 17:46:52 np0005592767 ovn_controller[94769]: 2026-01-22T22:46:52Z|00692|binding|INFO|Removing iface tap0e0ff891-b3 ovn-installed in OVS
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.890 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:52.896 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d4:40 10.100.0.11'], port_security=['fa:16:3e:bb:d4:40 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36fe4bac-181e-4bfc-86f7-eae4a762b7a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63313e52f3864087904d9eb367b6597c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7f64a1a8-1f1f-42f1-8371-6c9e7f4e6248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9bac3e2c-a40d-4e70-a8f4-b5b18df06d99, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0e0ff891-b3da-4f36-87e5-d9e183dd922a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:46:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:52.898 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0ff891-b3da-4f36-87e5-d9e183dd922a in datapath 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 unbound from our chassis
Jan 22 17:46:52 np0005592767 nova_compute[182623]: 2026-01-22 22:46:52.900 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:52.900 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:46:52 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:52.901 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3d299381-a49a-4420-a885-e157e1bc62c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:46:52 np0005592767 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 22 17:46:52 np0005592767 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d0000009c.scope: Consumed 1.965s CPU time.
Jan 22 17:46:52 np0005592767 systemd-machined[153912]: Machine qemu-84-instance-0000009c terminated.
Jan 22 17:46:53 np0005592767 NetworkManager[54973]: <info>  [1769122013.0350] manager: (tap0e0ff891-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/315)
Jan 22 17:46:53 np0005592767 nova_compute[182623]: 2026-01-22 22:46:53.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:53 np0005592767 nova_compute[182623]: 2026-01-22 22:46:53.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:53 np0005592767 nova_compute[182623]: 2026-01-22 22:46:53.080 182627 DEBUG nova.compute.manager [None req-ec2b007c-76b9-42ca-9acb-479ac93560c2 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:46:53 np0005592767 nova_compute[182623]: 2026-01-22 22:46:53.964 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.125 182627 DEBUG nova.compute.manager [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.125 182627 DEBUG oslo_concurrency.lockutils [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.126 182627 DEBUG oslo_concurrency.lockutils [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.126 182627 DEBUG oslo_concurrency.lockutils [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.126 182627 DEBUG nova.compute.manager [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.127 182627 WARNING nova.compute.manager [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state suspended and task_state None.
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.127 182627 DEBUG nova.compute.manager [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.127 182627 DEBUG oslo_concurrency.lockutils [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.128 182627 DEBUG oslo_concurrency.lockutils [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.128 182627 DEBUG oslo_concurrency.lockutils [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.128 182627 DEBUG nova.compute.manager [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:46:54 np0005592767 nova_compute[182623]: 2026-01-22 22:46:54.129 182627 WARNING nova.compute.manager [req-959678cd-a0e2-4619-9a5a-b8bfec6bed9b req-4ca11e48-4552-4923-8998-cd786fe021b3 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state suspended and task_state None.
Jan 22 17:46:55 np0005592767 nova_compute[182623]: 2026-01-22 22:46:55.644 182627 INFO nova.compute.manager [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Resuming
Jan 22 17:46:55 np0005592767 nova_compute[182623]: 2026-01-22 22:46:55.646 182627 DEBUG nova.objects.instance [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'flavor' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:46:55 np0005592767 nova_compute[182623]: 2026-01-22 22:46:55.705 182627 DEBUG oslo_concurrency.lockutils [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:46:55 np0005592767 nova_compute[182623]: 2026-01-22 22:46:55.706 182627 DEBUG oslo_concurrency.lockutils [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquired lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:46:55 np0005592767 nova_compute[182623]: 2026-01-22 22:46:55.706 182627 DEBUG nova.network.neutron [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:46:56 np0005592767 podman[235805]: 2026-01-22 22:46:56.195781037 +0000 UTC m=+0.100320879 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:46:56 np0005592767 nova_compute[182623]: 2026-01-22 22:46:56.957 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:56.957 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:46:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:56.960 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 17:46:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:46:56.962 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:46:57 np0005592767 nova_compute[182623]: 2026-01-22 22:46:57.443 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:46:58 np0005592767 nova_compute[182623]: 2026-01-22 22:46:58.969 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:00 np0005592767 nova_compute[182623]: 2026-01-22 22:47:00.969 182627 DEBUG nova.network.neutron [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Updating instance_info_cache with network_info: [{"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.005 182627 DEBUG oslo_concurrency.lockutils [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Releasing lock "refresh_cache-17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.012 182627 DEBUG nova.virt.libvirt.vif [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:46:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-221184610',display_name='tempest-TestServerAdvancedOps-server-221184610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-221184610',id=156,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:46:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='63313e52f3864087904d9eb367b6597c',ramdisk_id='',reservation_id='r-zz24bxb1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-974440594',owner_user_name='tempest-TestServerAdvancedOps-974440594-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:46:53Z,user_data=None,user_id='12468cd86a594cc5ba37213d454f45c8',uuid=17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.012 182627 DEBUG nova.network.os_vif_util [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converting VIF {"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.014 182627 DEBUG nova.network.os_vif_util [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.014 182627 DEBUG os_vif [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.015 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.015 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.016 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.020 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.021 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e0ff891-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.021 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e0ff891-b3, col_values=(('external_ids', {'iface-id': '0e0ff891-b3da-4f36-87e5-d9e183dd922a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:d4:40', 'vm-uuid': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.022 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.023 182627 INFO os_vif [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3')
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.061 182627 DEBUG nova.objects.instance [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'numa_topology' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:47:01 np0005592767 kernel: tap0e0ff891-b3: entered promiscuous mode
Jan 22 17:47:01 np0005592767 NetworkManager[54973]: <info>  [1769122021.1235] manager: (tap0e0ff891-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Jan 22 17:47:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:01Z|00693|binding|INFO|Claiming lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a for this chassis.
Jan 22 17:47:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:01Z|00694|binding|INFO|0e0ff891-b3da-4f36-87e5-d9e183dd922a: Claiming fa:16:3e:bb:d4:40 10.100.0.11
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.127 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:01.130 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d4:40 10.100.0.11'], port_security=['fa:16:3e:bb:d4:40 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36fe4bac-181e-4bfc-86f7-eae4a762b7a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63313e52f3864087904d9eb367b6597c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '7f64a1a8-1f1f-42f1-8371-6c9e7f4e6248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9bac3e2c-a40d-4e70-a8f4-b5b18df06d99, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0e0ff891-b3da-4f36-87e5-d9e183dd922a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 17:47:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:01.131 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0ff891-b3da-4f36-87e5-d9e183dd922a in datapath 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 bound to our chassis
Jan 22 17:47:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:01.132 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 17:47:01 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:01.133 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2a93a344-4a09-46e8-88d7-2d903fe6ef49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.137 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:01Z|00695|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a ovn-installed in OVS
Jan 22 17:47:01 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:01Z|00696|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a up in Southbound
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:01 np0005592767 systemd-udevd[235844]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:47:01 np0005592767 systemd-machined[153912]: New machine qemu-85-instance-0000009c.
Jan 22 17:47:01 np0005592767 NetworkManager[54973]: <info>  [1769122021.1730] device (tap0e0ff891-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:47:01 np0005592767 NetworkManager[54973]: <info>  [1769122021.1735] device (tap0e0ff891-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:47:01 np0005592767 systemd[1]: Started Virtual Machine qemu-85-instance-0000009c.
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.674 182627 DEBUG nova.compute.manager [req-0e428a31-62dd-43cc-bfac-33c86b5ec2b7 req-dc911304-8b21-4bbb-8a77-b84f3e76b590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.675 182627 DEBUG oslo_concurrency.lockutils [req-0e428a31-62dd-43cc-bfac-33c86b5ec2b7 req-dc911304-8b21-4bbb-8a77-b84f3e76b590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.675 182627 DEBUG oslo_concurrency.lockutils [req-0e428a31-62dd-43cc-bfac-33c86b5ec2b7 req-dc911304-8b21-4bbb-8a77-b84f3e76b590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.675 182627 DEBUG oslo_concurrency.lockutils [req-0e428a31-62dd-43cc-bfac-33c86b5ec2b7 req-dc911304-8b21-4bbb-8a77-b84f3e76b590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.675 182627 DEBUG nova.compute.manager [req-0e428a31-62dd-43cc-bfac-33c86b5ec2b7 req-dc911304-8b21-4bbb-8a77-b84f3e76b590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.675 182627 WARNING nova.compute.manager [req-0e428a31-62dd-43cc-bfac-33c86b5ec2b7 req-dc911304-8b21-4bbb-8a77-b84f3e76b590 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state suspended and task_state resuming.
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.815 182627 DEBUG nova.virt.libvirt.host [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Removed pending event for 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.816 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122021.8148198, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.816 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Started (Lifecycle Event)
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.852 182627 DEBUG nova.compute.manager [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.853 182627 DEBUG nova.objects.instance [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'pci_devices' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.884 182627 INFO nova.virt.libvirt.driver [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance running successfully.
Jan 22 17:47:01 np0005592767 virtqemud[182095]: argument unsupported: QEMU guest agent is not configured
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.887 182627 DEBUG nova.virt.libvirt.guest [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.888 182627 DEBUG nova.compute.manager [None req-c064b44c-470c-40d3-aae3-1e3a2d6f986e 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.897 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.902 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.971 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.972 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122021.82005, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 17:47:01 np0005592767 nova_compute[182623]: 2026-01-22 22:47:01.972 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Resumed (Lifecycle Event)
Jan 22 17:47:02 np0005592767 nova_compute[182623]: 2026-01-22 22:47:02.024 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 17:47:02 np0005592767 nova_compute[182623]: 2026-01-22 22:47:02.028 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 17:47:02 np0005592767 nova_compute[182623]: 2026-01-22 22:47:02.445 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.627 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.628 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.628 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.629 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.629 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.643 182627 INFO nova.compute.manager [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Terminating instance
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.653 182627 DEBUG nova.compute.manager [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 17:47:03 np0005592767 kernel: tap0e0ff891-b3 (unregistering): left promiscuous mode
Jan 22 17:47:03 np0005592767 NetworkManager[54973]: <info>  [1769122023.6748] device (tap0e0ff891-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:47:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:03Z|00697|binding|INFO|Releasing lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a from this chassis (sb_readonly=0)
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.680 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:03Z|00698|binding|INFO|Setting lport 0e0ff891-b3da-4f36-87e5-d9e183dd922a down in Southbound
Jan 22 17:47:03 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:03Z|00699|binding|INFO|Removing iface tap0e0ff891-b3 ovn-installed in OVS
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.682 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:03.696 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d4:40 10.100.0.11'], port_security=['fa:16:3e:bb:d4:40 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36fe4bac-181e-4bfc-86f7-eae4a762b7a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63313e52f3864087904d9eb367b6597c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7f64a1a8-1f1f-42f1-8371-6c9e7f4e6248', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9bac3e2c-a40d-4e70-a8f4-b5b18df06d99, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=0e0ff891-b3da-4f36-87e5-d9e183dd922a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:47:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:03.699 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0ff891-b3da-4f36-87e5-d9e183dd922a in datapath 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 unbound from our chassis#033[00m
Jan 22 17:47:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:03.701 104135 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 36fe4bac-181e-4bfc-86f7-eae4a762b7a1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 22 17:47:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:03.702 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c10c0254-6fe6-403d-9aea-622363d6c1c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:03 np0005592767 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 22 17:47:03 np0005592767 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d0000009c.scope: Consumed 2.499s CPU time.
Jan 22 17:47:03 np0005592767 systemd-machined[153912]: Machine qemu-85-instance-0000009c terminated.
Jan 22 17:47:03 np0005592767 NetworkManager[54973]: <info>  [1769122023.8732] manager: (tap0e0ff891-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.916 182627 INFO nova.virt.libvirt.driver [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Instance destroyed successfully.#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.917 182627 DEBUG nova.objects.instance [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lazy-loading 'resources' on Instance uuid 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.934 182627 DEBUG nova.compute.manager [req-0c79bea7-3170-4b99-8972-340a8e6d5d7e req-a4a30681-252d-4977-b3ac-c2e62ef8b785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.934 182627 DEBUG oslo_concurrency.lockutils [req-0c79bea7-3170-4b99-8972-340a8e6d5d7e req-a4a30681-252d-4977-b3ac-c2e62ef8b785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.935 182627 DEBUG oslo_concurrency.lockutils [req-0c79bea7-3170-4b99-8972-340a8e6d5d7e req-a4a30681-252d-4977-b3ac-c2e62ef8b785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.935 182627 DEBUG oslo_concurrency.lockutils [req-0c79bea7-3170-4b99-8972-340a8e6d5d7e req-a4a30681-252d-4977-b3ac-c2e62ef8b785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.936 182627 DEBUG nova.compute.manager [req-0c79bea7-3170-4b99-8972-340a8e6d5d7e req-a4a30681-252d-4977-b3ac-c2e62ef8b785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.936 182627 WARNING nova.compute.manager [req-0c79bea7-3170-4b99-8972-340a8e6d5d7e req-a4a30681-252d-4977-b3ac-c2e62ef8b785 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.972 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.983 182627 DEBUG nova.virt.libvirt.vif [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:46:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-221184610',display_name='tempest-TestServerAdvancedOps-server-221184610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-221184610',id=156,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:46:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63313e52f3864087904d9eb367b6597c',ramdisk_id='',reservation_id='r-zz24bxb1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-974440594',owner_user_name='tempest-TestServerAdvancedOps-974440594-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:47:01Z,user_data=None,user_id='12468cd86a594cc5ba37213d454f45c8',uuid=17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.984 182627 DEBUG nova.network.os_vif_util [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converting VIF {"id": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "address": "fa:16:3e:bb:d4:40", "network": {"id": "36fe4bac-181e-4bfc-86f7-eae4a762b7a1", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-963994317-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "63313e52f3864087904d9eb367b6597c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0ff891-b3", "ovs_interfaceid": "0e0ff891-b3da-4f36-87e5-d9e183dd922a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.985 182627 DEBUG nova.network.os_vif_util [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.986 182627 DEBUG os_vif [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.989 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.990 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e0ff891-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.993 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.996 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.998 182627 INFO os_vif [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:d4:40,bridge_name='br-int',has_traffic_filtering=True,id=0e0ff891-b3da-4f36-87e5-d9e183dd922a,network=Network(36fe4bac-181e-4bfc-86f7-eae4a762b7a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0ff891-b3')#033[00m
Jan 22 17:47:03 np0005592767 nova_compute[182623]: 2026-01-22 22:47:03.999 182627 INFO nova.virt.libvirt.driver [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Deleting instance files /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6_del#033[00m
Jan 22 17:47:04 np0005592767 nova_compute[182623]: 2026-01-22 22:47:04.000 182627 INFO nova.virt.libvirt.driver [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Deletion of /var/lib/nova/instances/17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6_del complete#033[00m
Jan 22 17:47:04 np0005592767 nova_compute[182623]: 2026-01-22 22:47:04.110 182627 INFO nova.compute.manager [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:47:04 np0005592767 nova_compute[182623]: 2026-01-22 22:47:04.111 182627 DEBUG oslo.service.loopingcall [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:47:04 np0005592767 nova_compute[182623]: 2026-01-22 22:47:04.111 182627 DEBUG nova.compute.manager [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:47:04 np0005592767 nova_compute[182623]: 2026-01-22 22:47:04.111 182627 DEBUG nova.network.neutron [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:47:05 np0005592767 nova_compute[182623]: 2026-01-22 22:47:05.973 182627 DEBUG nova.network.neutron [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:47:05 np0005592767 nova_compute[182623]: 2026-01-22 22:47:05.997 182627 INFO nova.compute.manager [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Took 1.89 seconds to deallocate network for instance.#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.089 182627 DEBUG nova.compute.manager [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.090 182627 DEBUG oslo_concurrency.lockutils [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.090 182627 DEBUG oslo_concurrency.lockutils [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.091 182627 DEBUG oslo_concurrency.lockutils [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.092 182627 DEBUG nova.compute.manager [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.092 182627 DEBUG nova.compute.manager [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-unplugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.092 182627 DEBUG nova.compute.manager [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.093 182627 DEBUG oslo_concurrency.lockutils [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.094 182627 DEBUG oslo_concurrency.lockutils [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.094 182627 DEBUG oslo_concurrency.lockutils [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.095 182627 DEBUG nova.compute.manager [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] No waiting events found dispatching network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.095 182627 WARNING nova.compute.manager [req-8cb51b92-c80b-4c47-b87f-83274a5cbc7e req-ed174f30-12f4-4207-99d3-f14160d3edcb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received unexpected event network-vif-plugged-0e0ff891-b3da-4f36-87e5-d9e183dd922a for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.108 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.109 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.194 182627 DEBUG nova.compute.provider_tree [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.213 182627 DEBUG nova.scheduler.client.report [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.239 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.291 182627 INFO nova.scheduler.client.report [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Deleted allocations for instance 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6#033[00m
Jan 22 17:47:06 np0005592767 nova_compute[182623]: 2026-01-22 22:47:06.422 182627 DEBUG oslo_concurrency.lockutils [None req-aecd6b52-409e-49a8-99ee-189b997366b8 12468cd86a594cc5ba37213d454f45c8 63313e52f3864087904d9eb367b6597c - - default default] Lock "17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:07 np0005592767 nova_compute[182623]: 2026-01-22 22:47:07.447 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:07 np0005592767 nova_compute[182623]: 2026-01-22 22:47:07.559 182627 DEBUG nova.compute.manager [req-09776500-1443-4ff4-8991-73d9048cdc44 req-89179d79-c42e-4be7-b64a-808d0a121447 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Received event network-vif-deleted-0e0ff891-b3da-4f36-87e5-d9e183dd922a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:07 np0005592767 nova_compute[182623]: 2026-01-22 22:47:07.651 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:08 np0005592767 podman[235883]: 2026-01-22 22:47:08.166769789 +0000 UTC m=+0.077801482 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:47:08 np0005592767 nova_compute[182623]: 2026-01-22 22:47:08.994 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:12.118 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:12.118 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:12.118 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:12 np0005592767 nova_compute[182623]: 2026-01-22 22:47:12.449 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:13 np0005592767 podman[235906]: 2026-01-22 22:47:13.181069592 +0000 UTC m=+0.096444329 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 22 17:47:13 np0005592767 podman[235905]: 2026-01-22 22:47:13.20961787 +0000 UTC m=+0.131830620 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, container_name=ovn_controller)
Jan 22 17:47:13 np0005592767 nova_compute[182623]: 2026-01-22 22:47:13.997 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:17 np0005592767 nova_compute[182623]: 2026-01-22 22:47:17.452 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:18 np0005592767 nova_compute[182623]: 2026-01-22 22:47:18.915 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122023.9129038, 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:47:18 np0005592767 nova_compute[182623]: 2026-01-22 22:47:18.915 182627 INFO nova.compute.manager [-] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:47:18 np0005592767 nova_compute[182623]: 2026-01-22 22:47:18.937 182627 DEBUG nova.compute.manager [None req-e8217827-9386-4dea-ba12-4d056fc1150d - - - - - -] [instance: 17f5ec4a-4e21-4c1e-90be-0e902bfb7ee6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:47:19 np0005592767 nova_compute[182623]: 2026-01-22 22:47:19.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:21 np0005592767 podman[235952]: 2026-01-22 22:47:21.17898275 +0000 UTC m=+0.084945264 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:47:21 np0005592767 podman[235953]: 2026-01-22 22:47:21.179445153 +0000 UTC m=+0.084214953 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:47:22 np0005592767 nova_compute[182623]: 2026-01-22 22:47:22.454 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:24 np0005592767 nova_compute[182623]: 2026-01-22 22:47:24.003 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.083 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.084 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.106 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.238 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.238 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.245 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.245 182627 INFO nova.compute.claims [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.395 182627 DEBUG nova.compute.provider_tree [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.410 182627 DEBUG nova.scheduler.client.report [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.429 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.429 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.495 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.495 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.516 182627 INFO nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.552 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.701 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.703 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.704 182627 INFO nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Creating image(s)#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.705 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.706 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.707 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.732 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.816 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.817 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.818 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.828 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.883 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.884 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.921 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.922 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.923 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.943 182627 DEBUG nova.policy [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.979 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.980 182627 DEBUG nova.virt.disk.api [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:47:26 np0005592767 nova_compute[182623]: 2026-01-22 22:47:26.980 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.033 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.035 182627 DEBUG nova.virt.disk.api [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.036 182627 DEBUG nova.objects.instance [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.051 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.052 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Ensure instance console log exists: /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.053 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.054 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.054 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:27 np0005592767 podman[236008]: 2026-01-22 22:47:27.157100977 +0000 UTC m=+0.079424397 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:47:27 np0005592767 nova_compute[182623]: 2026-01-22 22:47:27.457 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:28 np0005592767 nova_compute[182623]: 2026-01-22 22:47:28.104 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Successfully created port: 2b7b67c3-2e63-4c82-aef2-a408a460e74b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:47:28 np0005592767 nova_compute[182623]: 2026-01-22 22:47:28.870 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Successfully created port: b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:47:28 np0005592767 nova_compute[182623]: 2026-01-22 22:47:28.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.007 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.769 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Successfully updated port: 2b7b67c3-2e63-4c82-aef2-a408a460e74b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.981 182627 DEBUG nova.compute.manager [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-changed-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.982 182627 DEBUG nova.compute.manager [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing instance network info cache due to event network-changed-2b7b67c3-2e63-4c82-aef2-a408a460e74b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.982 182627 DEBUG oslo_concurrency.lockutils [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.983 182627 DEBUG oslo_concurrency.lockutils [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:47:29 np0005592767 nova_compute[182623]: 2026-01-22 22:47:29.983 182627 DEBUG nova.network.neutron [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing network info cache for port 2b7b67c3-2e63-4c82-aef2-a408a460e74b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.238 182627 DEBUG nova.network.neutron [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.918 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.919 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.919 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.920 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.944 182627 DEBUG nova.network.neutron [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:47:30 np0005592767 nova_compute[182623]: 2026-01-22 22:47:30.979 182627 DEBUG oslo_concurrency.lockutils [req-51e04417-48ab-462a-a207-51d1ff1112f8 req-ad85755e-9cc4-443c-81a9-4939a5f91286 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:47:31 np0005592767 nova_compute[182623]: 2026-01-22 22:47:31.287 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Successfully updated port: b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:47:31 np0005592767 nova_compute[182623]: 2026-01-22 22:47:31.305 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:47:31 np0005592767 nova_compute[182623]: 2026-01-22 22:47:31.306 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:47:31 np0005592767 nova_compute[182623]: 2026-01-22 22:47:31.306 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:47:31 np0005592767 nova_compute[182623]: 2026-01-22 22:47:31.573 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:47:32 np0005592767 nova_compute[182623]: 2026-01-22 22:47:32.118 182627 DEBUG nova.compute.manager [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-changed-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:32 np0005592767 nova_compute[182623]: 2026-01-22 22:47:32.119 182627 DEBUG nova.compute.manager [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing instance network info cache due to event network-changed-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:47:32 np0005592767 nova_compute[182623]: 2026-01-22 22:47:32.119 182627 DEBUG oslo_concurrency.lockutils [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:47:32 np0005592767 nova_compute[182623]: 2026-01-22 22:47:32.457 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:33 np0005592767 nova_compute[182623]: 2026-01-22 22:47:33.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:33 np0005592767 nova_compute[182623]: 2026-01-22 22:47:33.923 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:33 np0005592767 nova_compute[182623]: 2026-01-22 22:47:33.924 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:33 np0005592767 nova_compute[182623]: 2026-01-22 22:47:33.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:33 np0005592767 nova_compute[182623]: 2026-01-22 22:47:33.925 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.009 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.169 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.170 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5698MB free_disk=73.11970138549805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.170 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.171 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.246 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.246 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.247 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.294 182627 DEBUG nova.network.neutron [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updating instance_info_cache with network_info: [{"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": 
"2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.315 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.315 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance network_info: |[{"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.316 182627 DEBUG oslo_concurrency.lockutils [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.316 182627 DEBUG nova.network.neutron [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing network info cache for port b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.321 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Start _get_guest_xml network_info=[{"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.327 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.332 182627 WARNING nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.339 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.347 182627 DEBUG nova.virt.libvirt.host [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.348 182627 DEBUG nova.virt.libvirt.host [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.352 182627 DEBUG nova.virt.libvirt.host [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.353 182627 DEBUG nova.virt.libvirt.host [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.356 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.356 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.357 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.357 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.358 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.358 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.358 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.359 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.359 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.359 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.360 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.360 182627 DEBUG nova.virt.hardware [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.366 182627 DEBUG nova.virt.libvirt.vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-856211129',display_name='tempest-TestGettingAddress-server-856211129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-856211129',id=159,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-okvfuhxi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:47:26Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bb1cbc4a-90d9-439b-a2bf-0d4a3b533533,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.367 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.368 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.369 182627 DEBUG nova.virt.libvirt.vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-856211129',display_name='tempest-TestGettingAddress-server-856211129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-856211129',id=159,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-okvfuhxi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:47:26Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bb1cbc4a-90d9-439b-a2bf-0d4a3b533533,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.369 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.370 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.371 182627 DEBUG nova.objects.instance [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.373 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.374 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.388 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <uuid>bb1cbc4a-90d9-439b-a2bf-0d4a3b533533</uuid>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <name>instance-0000009f</name>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestGettingAddress-server-856211129</nova:name>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:47:34</nova:creationTime>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:port uuid="2b7b67c3-2e63-4c82-aef2-a408a460e74b">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        <nova:port uuid="b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe1d:2f14" ipVersion="6"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe1d:2f14" ipVersion="6"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <entry name="serial">bb1cbc4a-90d9-439b-a2bf-0d4a3b533533</entry>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <entry name="uuid">bb1cbc4a-90d9-439b-a2bf-0d4a3b533533</entry>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.config"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:fa:f7:44"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <target dev="tap2b7b67c3-2e"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:1d:2f:14"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <target dev="tapb61acc56-ff"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/console.log" append="off"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:47:34 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:47:34 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:47:34 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:47:34 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.390 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Preparing to wait for external event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.391 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.392 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.392 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.393 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Preparing to wait for external event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.393 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.394 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.394 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.395 182627 DEBUG nova.virt.libvirt.vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-856211129',display_name='tempest-TestGettingAddress-server-856211129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-856211129',id=159,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-okvfuhxi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:47:26Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bb1cbc4a-90d9-439b-a2bf-0d4a3b533533,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.396 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.397 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.397 182627 DEBUG os_vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.398 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.399 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.400 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.404 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.405 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b7b67c3-2e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.405 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2b7b67c3-2e, col_values=(('external_ids', {'iface-id': '2b7b67c3-2e63-4c82-aef2-a408a460e74b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:f7:44', 'vm-uuid': 'bb1cbc4a-90d9-439b-a2bf-0d4a3b533533'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.408 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 NetworkManager[54973]: <info>  [1769122054.4094] manager: (tap2b7b67c3-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.410 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.414 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.415 182627 INFO os_vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e')#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.416 182627 DEBUG nova.virt.libvirt.vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-856211129',display_name='tempest-TestGettingAddress-server-856211129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-856211129',id=159,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-okvfuhxi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:47:26Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bb1cbc4a-90d9-439b-a2bf-0d4a3b533533,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.417 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.418 182627 DEBUG nova.network.os_vif_util [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.418 182627 DEBUG os_vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.419 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.419 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.420 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.422 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb61acc56-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.423 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb61acc56-ff, col_values=(('external_ids', {'iface-id': 'b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:2f:14', 'vm-uuid': 'bb1cbc4a-90d9-439b-a2bf-0d4a3b533533'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.424 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 NetworkManager[54973]: <info>  [1769122054.4252] manager: (tapb61acc56-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.427 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.432 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.433 182627 INFO os_vif [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff')#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.488 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.489 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.489 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:fa:f7:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.489 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:1d:2f:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:47:34 np0005592767 nova_compute[182623]: 2026-01-22 22:47:34.490 182627 INFO nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Using config drive#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.124 182627 INFO nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Creating config drive at /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.config#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.134 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm9fty0j_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.265 182627 DEBUG oslo_concurrency.processutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm9fty0j_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:47:35 np0005592767 kernel: tap2b7b67c3-2e: entered promiscuous mode
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.3504] manager: (tap2b7b67c3-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00700|binding|INFO|Claiming lport 2b7b67c3-2e63-4c82-aef2-a408a460e74b for this chassis.
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.354 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00701|binding|INFO|2b7b67c3-2e63-4c82-aef2-a408a460e74b: Claiming fa:16:3e:fa:f7:44 10.100.0.4
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.361 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.374 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.375 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.3776] manager: (tapb61acc56-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/321)
Jan 22 17:47:35 np0005592767 kernel: tapb61acc56-ff: entered promiscuous mode
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.3810] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.379 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.3825] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 22 17:47:35 np0005592767 systemd-udevd[236059]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:47:35 np0005592767 systemd-udevd[236058]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.403 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:f7:44 10.100.0.4'], port_security=['fa:16:3e:fa:f7:44 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bb1cbc4a-90d9-439b-a2bf-0d4a3b533533', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55b79226-17c7-4623-9f19-8585aca1b119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=297b2520-2860-4872-a497-3a3478b0820d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2b7b67c3-2e63-4c82-aef2-a408a460e74b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.405 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2b7b67c3-2e63-4c82-aef2-a408a460e74b in datapath 55b79226-17c7-4623-9f19-8585aca1b119 bound to our chassis#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.406 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55b79226-17c7-4623-9f19-8585aca1b119#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.418 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[13379151-958c-479a-a5f5-3ff8d6d39b03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.419 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55b79226-11 in ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.4201] device (tapb61acc56-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.4215] device (tapb61acc56-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.421 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55b79226-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.421 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[51ad920f-a76b-4554-a05a-b9e4248baac0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.4229] device (tap2b7b67c3-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.423 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a34c5005-3a61-4088-bf0d-6a45df7cafc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.4241] device (tap2b7b67c3-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.438 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[8a988fbc-dcba-4515-bbf9-087385965a55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 systemd-machined[153912]: New machine qemu-86-instance-0000009f.
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.465 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4d06ba78-196f-4d2d-b6a7-9bc4a91b0a1f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.492 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c765a522-c0c2-41d6-bc8f-7f527f8dbcd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.513 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c13413-6913-4ecc-a6a4-6a667525cb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.5203] manager: (tap55b79226-10): new Veth device (/org/freedesktop/NetworkManager/Devices/324)
Jan 22 17:47:35 np0005592767 systemd[1]: Started Virtual Machine qemu-86-instance-0000009f.
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.547 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4e4e6c-4a9c-4b4d-bd19-9dfd44161d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.553 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[19040382-767a-47e3-a3a0-5c3a42981768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.582 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[0e37de42-bc28-45e4-8fee-0e3e0a7fbff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.5927] device (tap55b79226-10): carrier: link connected
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.593 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.596 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.600 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e598f75f-dccb-464a-9365-6d98e2a5f770]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55b79226-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:9b:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565215, 'reachable_time': 19378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236094, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00702|binding|INFO|Claiming lport b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 for this chassis.
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00703|binding|INFO|b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9: Claiming fa:16:3e:1d:2f:14 2001:db8:0:1:f816:3eff:fe1d:2f14 2001:db8::f816:3eff:fe1d:2f14
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00704|binding|INFO|Setting lport 2b7b67c3-2e63-4c82-aef2-a408a460e74b ovn-installed in OVS
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.620 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[37f52cf2-738c-447c-9c24-815618933f63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:9b3d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565215, 'tstamp': 565215}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236096, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.622 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00705|binding|INFO|Setting lport 2b7b67c3-2e63-4c82-aef2-a408a460e74b up in Southbound
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.631 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:2f:14 2001:db8:0:1:f816:3eff:fe1d:2f14 2001:db8::f816:3eff:fe1d:2f14'], port_security=['fa:16:3e:1d:2f:14 2001:db8:0:1:f816:3eff:fe1d:2f14 2001:db8::f816:3eff:fe1d:2f14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe1d:2f14/64 2001:db8::f816:3eff:fe1d:2f14/64', 'neutron:device_id': 'bb1cbc4a-90d9-439b-a2bf-0d4a3b533533', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f24457d1-1f42-46ad-bdaa-d087103c906a, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00706|binding|INFO|Setting lport b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 ovn-installed in OVS
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00707|binding|INFO|Setting lport b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 up in Southbound
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.640 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3e8369-ba90-4cc5-81e7-0535985d30b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55b79226-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:9b:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565215, 'reachable_time': 19378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236097, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.672 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecb1270-21af-493a-8cc4-c593b3e0d472]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.735 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8af3e827-497a-4f31-91d7-0226f871e8ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.737 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55b79226-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.737 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.738 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55b79226-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.739 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 NetworkManager[54973]: <info>  [1769122055.7407] manager: (tap55b79226-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 22 17:47:35 np0005592767 kernel: tap55b79226-10: entered promiscuous mode
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.742 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.744 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55b79226-10, col_values=(('external_ids', {'iface-id': '781e89a1-cc2a-4079-9a3d-fcfacb1013c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:35Z|00708|binding|INFO|Releasing lport 781e89a1-cc2a-4079-9a3d-fcfacb1013c1 from this chassis (sb_readonly=0)
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.746 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.748 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55b79226-17c7-4623-9f19-8585aca1b119.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55b79226-17c7-4623-9f19-8585aca1b119.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.749 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[622ccba0-cb07-4e7e-bb9f-248e6c38ea84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.750 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-55b79226-17c7-4623-9f19-8585aca1b119
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/55b79226-17c7-4623-9f19-8585aca1b119.pid.haproxy
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 55b79226-17c7-4623-9f19-8585aca1b119
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:47:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:35.750 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'env', 'PROCESS_TAG=haproxy-55b79226-17c7-4623-9f19-8585aca1b119', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55b79226-17c7-4623-9f19-8585aca1b119.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.778 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122055.7780325, bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.779 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] VM Started (Lifecycle Event)#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.811 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.816 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122055.7782674, bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.817 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.960 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:47:35 np0005592767 nova_compute[182623]: 2026-01-22 22:47:35.971 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.000 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:47:36 np0005592767 podman[236137]: 2026-01-22 22:47:36.159645993 +0000 UTC m=+0.052753464 container create f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:47:36 np0005592767 systemd[1]: Started libpod-conmon-f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f.scope.
Jan 22 17:47:36 np0005592767 podman[236137]: 2026-01-22 22:47:36.134473491 +0000 UTC m=+0.027580962 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:47:36 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.236 182627 DEBUG nova.compute.manager [req-6c68cc8c-2b2f-4646-beb3-0492aa9b2c98 req-eefef9bb-f305-4bc7-b5c2-fc29e7bc7ba6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.237 182627 DEBUG oslo_concurrency.lockutils [req-6c68cc8c-2b2f-4646-beb3-0492aa9b2c98 req-eefef9bb-f305-4bc7-b5c2-fc29e7bc7ba6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.237 182627 DEBUG oslo_concurrency.lockutils [req-6c68cc8c-2b2f-4646-beb3-0492aa9b2c98 req-eefef9bb-f305-4bc7-b5c2-fc29e7bc7ba6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.237 182627 DEBUG oslo_concurrency.lockutils [req-6c68cc8c-2b2f-4646-beb3-0492aa9b2c98 req-eefef9bb-f305-4bc7-b5c2-fc29e7bc7ba6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.237 182627 DEBUG nova.compute.manager [req-6c68cc8c-2b2f-4646-beb3-0492aa9b2c98 req-eefef9bb-f305-4bc7-b5c2-fc29e7bc7ba6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Processing event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:47:36 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12bdf7dde623f40e2d6fb75b32ab206b4842cbd8da74619aaf2608e2c6a810e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:47:36 np0005592767 podman[236137]: 2026-01-22 22:47:36.249427632 +0000 UTC m=+0.142535133 container init f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:47:36 np0005592767 podman[236137]: 2026-01-22 22:47:36.254385442 +0000 UTC m=+0.147492913 container start f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:47:36 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [NOTICE]   (236157) : New worker (236159) forked
Jan 22 17:47:36 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [NOTICE]   (236157) : Loading success.
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.330 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 in datapath 09b515c7-d044-43d4-b895-408eb5de1fd8 unbound from our chassis#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.333 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09b515c7-d044-43d4-b895-408eb5de1fd8#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.347 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ec87e607-b0ce-40a1-87d6-82352f00eb19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.348 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09b515c7-d1 in ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.352 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09b515c7-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.352 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[26dfad3e-06ea-4409-905d-a1554a8ec487]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.354 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[58c4f768-3cd5-4686-b1a9-3c269aff6f29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.370 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[aca7bf2d-5032-4576-ac29-fd6c9575bb58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.397 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b026e58a-4aad-4bf8-97cd-b8b5f11cca65]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.453 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5c384b-4cec-4cc6-b6e6-1462f05f471e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 systemd-udevd[236073]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:47:36 np0005592767 NetworkManager[54973]: <info>  [1769122056.4641] manager: (tap09b515c7-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/326)
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.462 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5339f73f-d0ca-43e8-9565-58ee69d0a3dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.506 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[69471ffa-42d5-4ade-9ea3-a667f640bbc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.510 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[bca5b87a-4a51-4a0b-8818-3dd8d3326e20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 NetworkManager[54973]: <info>  [1769122056.5480] device (tap09b515c7-d0): carrier: link connected
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.557 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[ede991a7-a953-4eae-8062-a697666101cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.580 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[772e4d38-7f06-4a7a-8ff3-6bd9dd505e9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09b515c7-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:57:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565312, 'reachable_time': 26545, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236178, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.599 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3b4833fc-b719-4d6a-b9b9-d60657f1d4f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe60:579d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565312, 'tstamp': 565312}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236179, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.618 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5495d6ba-141d-4c1a-9054-ce68c8e155f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09b515c7-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:57:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565312, 'reachable_time': 26545, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236180, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.655 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bc46e44f-0a2c-4408-9f46-947d080fa493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.694 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4793e131-db70-4ca7-84d8-8fa752c87d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.696 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09b515c7-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.697 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.698 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09b515c7-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.700 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:36 np0005592767 kernel: tap09b515c7-d0: entered promiscuous mode
Jan 22 17:47:36 np0005592767 NetworkManager[54973]: <info>  [1769122056.7022] manager: (tap09b515c7-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.703 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.704 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09b515c7-d0, col_values=(('external_ids', {'iface-id': 'f20a608b-4dde-4090-8331-5a96db0eeb25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.706 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:36 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:36Z|00709|binding|INFO|Releasing lport f20a608b-4dde-4090-8331-5a96db0eeb25 from this chassis (sb_readonly=0)
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.721 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.722 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09b515c7-d044-43d4-b895-408eb5de1fd8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09b515c7-d044-43d4-b895-408eb5de1fd8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.724 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b042bf5c-2ab7-48a6-8537-0f77c6edbaf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.725 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-09b515c7-d044-43d4-b895-408eb5de1fd8
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/09b515c7-d044-43d4-b895-408eb5de1fd8.pid.haproxy
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 09b515c7-d044-43d4-b895-408eb5de1fd8
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:47:36 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:47:36.726 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'env', 'PROCESS_TAG=haproxy-09b515c7-d044-43d4-b895-408eb5de1fd8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09b515c7-d044-43d4-b895-408eb5de1fd8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.800 182627 DEBUG nova.network.neutron [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updated VIF entry in instance network info cache for port b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.800 182627 DEBUG nova.network.neutron [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updating instance_info_cache with network_info: [{"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": 
[], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:47:36 np0005592767 nova_compute[182623]: 2026-01-22 22:47:36.819 182627 DEBUG oslo_concurrency.lockutils [req-a5b05b12-9080-419d-a681-063c9c2a3316 req-94549838-b87d-4406-b28e-76a31bfca5d5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:47:37 np0005592767 podman[236210]: 2026-01-22 22:47:37.142412081 +0000 UTC m=+0.049122700 container create 822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 17:47:37 np0005592767 systemd[1]: Started libpod-conmon-822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214.scope.
Jan 22 17:47:37 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:47:37 np0005592767 podman[236210]: 2026-01-22 22:47:37.117857797 +0000 UTC m=+0.024568436 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:47:37 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d50ff3701dcb0b9028a286e33d335870622f2c63c895972bf15521df5c91abc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:47:37 np0005592767 podman[236210]: 2026-01-22 22:47:37.245887708 +0000 UTC m=+0.152598427 container init 822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 17:47:37 np0005592767 podman[236210]: 2026-01-22 22:47:37.254042329 +0000 UTC m=+0.160752988 container start 822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 17:47:37 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [NOTICE]   (236230) : New worker (236232) forked
Jan 22 17:47:37 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [NOTICE]   (236230) : Loading success.
Jan 22 17:47:37 np0005592767 nova_compute[182623]: 2026-01-22 22:47:37.496 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:37 np0005592767 nova_compute[182623]: 2026-01-22 22:47:37.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:37 np0005592767 nova_compute[182623]: 2026-01-22 22:47:37.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.421 182627 DEBUG nova.compute.manager [req-8813a182-7c50-46f2-8e2b-fa06e5b4d880 req-3c2fbb5f-8e31-4255-ba1c-691cd41020cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.421 182627 DEBUG oslo_concurrency.lockutils [req-8813a182-7c50-46f2-8e2b-fa06e5b4d880 req-3c2fbb5f-8e31-4255-ba1c-691cd41020cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.421 182627 DEBUG oslo_concurrency.lockutils [req-8813a182-7c50-46f2-8e2b-fa06e5b4d880 req-3c2fbb5f-8e31-4255-ba1c-691cd41020cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.422 182627 DEBUG oslo_concurrency.lockutils [req-8813a182-7c50-46f2-8e2b-fa06e5b4d880 req-3c2fbb5f-8e31-4255-ba1c-691cd41020cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.422 182627 DEBUG nova.compute.manager [req-8813a182-7c50-46f2-8e2b-fa06e5b4d880 req-3c2fbb5f-8e31-4255-ba1c-691cd41020cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] No event matching network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 in dict_keys([('network-vif-plugged', '2b7b67c3-2e63-4c82-aef2-a408a460e74b')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.422 182627 WARNING nova.compute.manager [req-8813a182-7c50-46f2-8e2b-fa06e5b4d880 req-3c2fbb5f-8e31-4255-ba1c-691cd41020cb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received unexpected event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.669 182627 DEBUG nova.compute.manager [req-52b09c07-3529-4589-9064-b0847c853f3c req-d5a97fa1-62b9-436e-9e91-43143e0e5e51 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.669 182627 DEBUG oslo_concurrency.lockutils [req-52b09c07-3529-4589-9064-b0847c853f3c req-d5a97fa1-62b9-436e-9e91-43143e0e5e51 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.670 182627 DEBUG oslo_concurrency.lockutils [req-52b09c07-3529-4589-9064-b0847c853f3c req-d5a97fa1-62b9-436e-9e91-43143e0e5e51 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.670 182627 DEBUG oslo_concurrency.lockutils [req-52b09c07-3529-4589-9064-b0847c853f3c req-d5a97fa1-62b9-436e-9e91-43143e0e5e51 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.671 182627 DEBUG nova.compute.manager [req-52b09c07-3529-4589-9064-b0847c853f3c req-d5a97fa1-62b9-436e-9e91-43143e0e5e51 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Processing event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.672 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.678 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122058.6782641, bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.679 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.683 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.688 182627 INFO nova.virt.libvirt.driver [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance spawned successfully.#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.689 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.717 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.725 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.730 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.731 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.732 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.732 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.733 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.734 182627 DEBUG nova.virt.libvirt.driver [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.765 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.853 182627 INFO nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Took 12.15 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.854 182627 DEBUG nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.965 182627 INFO nova.compute.manager [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Took 12.78 seconds to build instance.#033[00m
Jan 22 17:47:38 np0005592767 nova_compute[182623]: 2026-01-22 22:47:38.988 182627 DEBUG oslo_concurrency.lockutils [None req-4c2344ba-a0d9-4b3b-a77a-b1561a54a53e 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:39 np0005592767 podman[236241]: 2026-01-22 22:47:39.224125454 +0000 UTC m=+0.124071520 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 22 17:47:39 np0005592767 nova_compute[182623]: 2026-01-22 22:47:39.426 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:40 np0005592767 nova_compute[182623]: 2026-01-22 22:47:40.894 182627 DEBUG nova.compute.manager [req-c83ebcd4-2634-4444-b6fa-abf20a72ec27 req-400307b6-27fc-4c60-b9c8-9066a6f6139c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:40 np0005592767 nova_compute[182623]: 2026-01-22 22:47:40.894 182627 DEBUG oslo_concurrency.lockutils [req-c83ebcd4-2634-4444-b6fa-abf20a72ec27 req-400307b6-27fc-4c60-b9c8-9066a6f6139c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:47:40 np0005592767 nova_compute[182623]: 2026-01-22 22:47:40.895 182627 DEBUG oslo_concurrency.lockutils [req-c83ebcd4-2634-4444-b6fa-abf20a72ec27 req-400307b6-27fc-4c60-b9c8-9066a6f6139c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:47:40 np0005592767 nova_compute[182623]: 2026-01-22 22:47:40.895 182627 DEBUG oslo_concurrency.lockutils [req-c83ebcd4-2634-4444-b6fa-abf20a72ec27 req-400307b6-27fc-4c60-b9c8-9066a6f6139c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:47:40 np0005592767 nova_compute[182623]: 2026-01-22 22:47:40.895 182627 DEBUG nova.compute.manager [req-c83ebcd4-2634-4444-b6fa-abf20a72ec27 req-400307b6-27fc-4c60-b9c8-9066a6f6139c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] No waiting events found dispatching network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:47:40 np0005592767 nova_compute[182623]: 2026-01-22 22:47:40.895 182627 WARNING nova.compute.manager [req-c83ebcd4-2634-4444-b6fa-abf20a72ec27 req-400307b6-27fc-4c60-b9c8-9066a6f6139c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received unexpected event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b for instance with vm_state active and task_state None.#033[00m
Jan 22 17:47:41 np0005592767 nova_compute[182623]: 2026-01-22 22:47:41.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:42 np0005592767 nova_compute[182623]: 2026-01-22 22:47:42.498 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:43 np0005592767 nova_compute[182623]: 2026-01-22 22:47:43.140 182627 DEBUG nova.compute.manager [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-changed-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:47:43 np0005592767 nova_compute[182623]: 2026-01-22 22:47:43.141 182627 DEBUG nova.compute.manager [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing instance network info cache due to event network-changed-2b7b67c3-2e63-4c82-aef2-a408a460e74b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:47:43 np0005592767 nova_compute[182623]: 2026-01-22 22:47:43.141 182627 DEBUG oslo_concurrency.lockutils [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:47:43 np0005592767 nova_compute[182623]: 2026-01-22 22:47:43.141 182627 DEBUG oslo_concurrency.lockutils [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:47:43 np0005592767 nova_compute[182623]: 2026-01-22 22:47:43.141 182627 DEBUG nova.network.neutron [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing network info cache for port 2b7b67c3-2e63-4c82-aef2-a408a460e74b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:47:43 np0005592767 nova_compute[182623]: 2026-01-22 22:47:43.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:47:44 np0005592767 podman[236266]: 2026-01-22 22:47:44.215975134 +0000 UTC m=+0.114000936 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350)
Jan 22 17:47:44 np0005592767 podman[236265]: 2026-01-22 22:47:44.242981378 +0000 UTC m=+0.148580104 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller)
Jan 22 17:47:44 np0005592767 nova_compute[182623]: 2026-01-22 22:47:44.428 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:45 np0005592767 nova_compute[182623]: 2026-01-22 22:47:45.943 182627 DEBUG nova.network.neutron [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updated VIF entry in instance network info cache for port 2b7b67c3-2e63-4c82-aef2-a408a460e74b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:47:45 np0005592767 nova_compute[182623]: 2026-01-22 22:47:45.946 182627 DEBUG nova.network.neutron [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updating instance_info_cache with network_info: [{"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:47:45 np0005592767 nova_compute[182623]: 2026-01-22 22:47:45.964 182627 DEBUG oslo_concurrency.lockutils [req-a8544d42-53fa-477a-a0e7-3918adaa0945 req-b6e9ab20-e5ed-4ec0-a63b-553c35e97826 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:47:47 np0005592767 nova_compute[182623]: 2026-01-22 22:47:47.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:49 np0005592767 nova_compute[182623]: 2026-01-22 22:47:49.432 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:51Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:f7:44 10.100.0.4
Jan 22 17:47:51 np0005592767 ovn_controller[94769]: 2026-01-22T22:47:51Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:f7:44 10.100.0.4
Jan 22 17:47:52 np0005592767 podman[236337]: 2026-01-22 22:47:52.147854915 +0000 UTC m=+0.060969856 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:47:52 np0005592767 podman[236336]: 2026-01-22 22:47:52.152260429 +0000 UTC m=+0.063088285 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:47:52 np0005592767 nova_compute[182623]: 2026-01-22 22:47:52.544 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:54 np0005592767 nova_compute[182623]: 2026-01-22 22:47:54.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:57 np0005592767 nova_compute[182623]: 2026-01-22 22:47:57.554 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:47:58 np0005592767 podman[236382]: 2026-01-22 22:47:58.151922505 +0000 UTC m=+0.071290218 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:47:59 np0005592767 nova_compute[182623]: 2026-01-22 22:47:59.441 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.524 182627 DEBUG nova.compute.manager [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-changed-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.525 182627 DEBUG nova.compute.manager [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing instance network info cache due to event network-changed-2b7b67c3-2e63-4c82-aef2-a408a460e74b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.525 182627 DEBUG oslo_concurrency.lockutils [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.526 182627 DEBUG oslo_concurrency.lockutils [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.526 182627 DEBUG nova.network.neutron [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Refreshing network info cache for port 2b7b67c3-2e63-4c82-aef2-a408a460e74b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.556 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.598 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.598 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.599 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.599 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.599 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.613 182627 INFO nova.compute.manager [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Terminating instance#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.624 182627 DEBUG nova.compute.manager [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:48:02 np0005592767 kernel: tap2b7b67c3-2e (unregistering): left promiscuous mode
Jan 22 17:48:02 np0005592767 NetworkManager[54973]: <info>  [1769122082.6554] device (tap2b7b67c3-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:48:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:02Z|00710|binding|INFO|Releasing lport 2b7b67c3-2e63-4c82-aef2-a408a460e74b from this chassis (sb_readonly=0)
Jan 22 17:48:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:02Z|00711|binding|INFO|Setting lport 2b7b67c3-2e63-4c82-aef2-a408a460e74b down in Southbound
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.667 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:02Z|00712|binding|INFO|Removing iface tap2b7b67c3-2e ovn-installed in OVS
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.670 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:02.676 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:f7:44 10.100.0.4'], port_security=['fa:16:3e:fa:f7:44 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bb1cbc4a-90d9-439b-a2bf-0d4a3b533533', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55b79226-17c7-4623-9f19-8585aca1b119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=297b2520-2860-4872-a497-3a3478b0820d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2b7b67c3-2e63-4c82-aef2-a408a460e74b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:48:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:02.679 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2b7b67c3-2e63-4c82-aef2-a408a460e74b in datapath 55b79226-17c7-4623-9f19-8585aca1b119 unbound from our chassis#033[00m
Jan 22 17:48:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:02.682 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55b79226-17c7-4623-9f19-8585aca1b119, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:48:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:02.684 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[753e0304-7c07-4ab7-8ee0-9b76277a3878]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:02.685 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 namespace which is not needed anymore#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.696 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 kernel: tapb61acc56-ff (unregistering): left promiscuous mode
Jan 22 17:48:02 np0005592767 NetworkManager[54973]: <info>  [1769122082.7201] device (tapb61acc56-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.728 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:02Z|00713|binding|INFO|Releasing lport b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 from this chassis (sb_readonly=0)
Jan 22 17:48:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:02Z|00714|binding|INFO|Setting lport b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 down in Southbound
Jan 22 17:48:02 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:02Z|00715|binding|INFO|Removing iface tapb61acc56-ff ovn-installed in OVS
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:02.743 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:2f:14 2001:db8:0:1:f816:3eff:fe1d:2f14 2001:db8::f816:3eff:fe1d:2f14'], port_security=['fa:16:3e:1d:2f:14 2001:db8:0:1:f816:3eff:fe1d:2f14 2001:db8::f816:3eff:fe1d:2f14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe1d:2f14/64 2001:db8::f816:3eff:fe1d:2f14/64', 'neutron:device_id': 'bb1cbc4a-90d9-439b-a2bf-0d4a3b533533', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09b515c7-d044-43d4-b895-408eb5de1fd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79e3b6c9-1702-442a-9d28-07d25f4489ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f24457d1-1f42-46ad-bdaa-d087103c906a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Jan 22 17:48:02 np0005592767 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d0000009f.scope: Consumed 13.411s CPU time.
Jan 22 17:48:02 np0005592767 systemd-machined[153912]: Machine qemu-86-instance-0000009f terminated.
Jan 22 17:48:02 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [NOTICE]   (236157) : haproxy version is 2.8.14-c23fe91
Jan 22 17:48:02 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [NOTICE]   (236157) : path to executable is /usr/sbin/haproxy
Jan 22 17:48:02 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [WARNING]  (236157) : Exiting Master process...
Jan 22 17:48:02 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [WARNING]  (236157) : Exiting Master process...
Jan 22 17:48:02 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [ALERT]    (236157) : Current worker (236159) exited with code 143 (Terminated)
Jan 22 17:48:02 np0005592767 neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119[236153]: [WARNING]  (236157) : All workers exited. Exiting... (0)
Jan 22 17:48:02 np0005592767 systemd[1]: libpod-f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f.scope: Deactivated successfully.
Jan 22 17:48:02 np0005592767 podman[236436]: 2026-01-22 22:48:02.858932848 +0000 UTC m=+0.054383079 container died f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:48:02 np0005592767 NetworkManager[54973]: <info>  [1769122082.8663] manager: (tapb61acc56-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/328)
Jan 22 17:48:02 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f-userdata-shm.mount: Deactivated successfully.
Jan 22 17:48:02 np0005592767 systemd[1]: var-lib-containers-storage-overlay-12bdf7dde623f40e2d6fb75b32ab206b4842cbd8da74619aaf2608e2c6a810e9-merged.mount: Deactivated successfully.
Jan 22 17:48:02 np0005592767 podman[236436]: 2026-01-22 22:48:02.913300876 +0000 UTC m=+0.108751097 container cleanup f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.919 182627 INFO nova.virt.libvirt.driver [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance destroyed successfully.#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.920 182627 DEBUG nova.objects.instance [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:48:02 np0005592767 systemd[1]: libpod-conmon-f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f.scope: Deactivated successfully.
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.938 182627 DEBUG nova.virt.libvirt.vif [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-856211129',display_name='tempest-TestGettingAddress-server-856211129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-856211129',id=159,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:47:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-okvfuhxi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:47:38Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bb1cbc4a-90d9-439b-a2bf-0d4a3b533533,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.939 182627 DEBUG nova.network.os_vif_util [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.940 182627 DEBUG nova.network.os_vif_util [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.940 182627 DEBUG os_vif [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.944 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.944 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b7b67c3-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.946 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.949 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.951 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.955 182627 INFO os_vif [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:f7:44,bridge_name='br-int',has_traffic_filtering=True,id=2b7b67c3-2e63-4c82-aef2-a408a460e74b,network=Network(55b79226-17c7-4623-9f19-8585aca1b119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b7b67c3-2e')#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.956 182627 DEBUG nova.virt.libvirt.vif [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-856211129',display_name='tempest-TestGettingAddress-server-856211129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-856211129',id=159,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMz8m+p3eaIXCq8y9DamvCQCj03zbxurzfrceVnyqQS4zmTejUfDih+e/FiVysbsOMaQJ1ikELmfakHEaRSE6BGYO4NXDc7XMw8US38gLkGIigc3lyFncZ6os1bS87Yqjw==',key_name='tempest-TestGettingAddress-1641710974',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:47:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-okvfuhxi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:47:38Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=bb1cbc4a-90d9-439b-a2bf-0d4a3b533533,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.957 182627 DEBUG nova.network.os_vif_util [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.958 182627 DEBUG nova.network.os_vif_util [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.958 182627 DEBUG os_vif [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.960 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.960 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb61acc56-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.962 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.964 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.966 182627 INFO os_vif [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:2f:14,bridge_name='br-int',has_traffic_filtering=True,id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9,network=Network(09b515c7-d044-43d4-b895-408eb5de1fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb61acc56-ff')#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.967 182627 INFO nova.virt.libvirt.driver [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Deleting instance files /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533_del#033[00m
Jan 22 17:48:02 np0005592767 nova_compute[182623]: 2026-01-22 22:48:02.968 182627 INFO nova.virt.libvirt.driver [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Deletion of /var/lib/nova/instances/bb1cbc4a-90d9-439b-a2bf-0d4a3b533533_del complete#033[00m
Jan 22 17:48:03 np0005592767 podman[236497]: 2026-01-22 22:48:03.002740846 +0000 UTC m=+0.057051575 container remove f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.008 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[afbecab9-3d50-4126-9dd3-403993f2f84d]: (4, ('Thu Jan 22 10:48:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 (f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f)\nf0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f\nThu Jan 22 10:48:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 (f0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f)\nf0578ffd8b60074213204e3eca6bc8cd02f5782ec3739a5ef5bf68fac0cec67f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.011 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[424b8c17-362e-4de3-89d0-39a2ad4bb5ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.012 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55b79226-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:03 np0005592767 kernel: tap55b79226-10: left promiscuous mode
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.015 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.038 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.042 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed04894-3469-4a0c-8176-05f6422680ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.049 182627 DEBUG nova.compute.manager [req-9c6120a3-091b-4d76-a7d6-77586f75bad2 req-754aec96-df10-4e28-958b-8ea862fe70a6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-unplugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.053 182627 DEBUG oslo_concurrency.lockutils [req-9c6120a3-091b-4d76-a7d6-77586f75bad2 req-754aec96-df10-4e28-958b-8ea862fe70a6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.053 182627 DEBUG oslo_concurrency.lockutils [req-9c6120a3-091b-4d76-a7d6-77586f75bad2 req-754aec96-df10-4e28-958b-8ea862fe70a6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.053 182627 DEBUG oslo_concurrency.lockutils [req-9c6120a3-091b-4d76-a7d6-77586f75bad2 req-754aec96-df10-4e28-958b-8ea862fe70a6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.053 182627 DEBUG nova.compute.manager [req-9c6120a3-091b-4d76-a7d6-77586f75bad2 req-754aec96-df10-4e28-958b-8ea862fe70a6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] No waiting events found dispatching network-vif-unplugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.054 182627 DEBUG nova.compute.manager [req-9c6120a3-091b-4d76-a7d6-77586f75bad2 req-754aec96-df10-4e28-958b-8ea862fe70a6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-unplugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.059 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5556f798-63e6-4ff1-80e0-53339cd2ee37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.060 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[780d27bf-0455-4af6-af29-92665bcd3402]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.061 182627 INFO nova.compute.manager [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.062 182627 DEBUG oslo.service.loopingcall [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.063 182627 DEBUG nova.compute.manager [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.063 182627 DEBUG nova.network.neutron [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.089 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[99d63668-a616-43c9-8cad-cd25724f9ad9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565206, 'reachable_time': 16535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236514, 'error': None, 'target': 'ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 systemd[1]: run-netns-ovnmeta\x2d55b79226\x2d17c7\x2d4623\x2d9f19\x2d8585aca1b119.mount: Deactivated successfully.
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.095 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55b79226-17c7-4623-9f19-8585aca1b119 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.097 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[97f30ebb-de1a-423e-8a21-979c20f52fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.098 104135 INFO neutron.agent.ovn.metadata.agent [-] Port b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 in datapath 09b515c7-d044-43d4-b895-408eb5de1fd8 unbound from our chassis#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.100 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09b515c7-d044-43d4-b895-408eb5de1fd8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.101 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9288975b-16f7-4e77-a9a6-3c96980e95ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.101 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 namespace which is not needed anymore#033[00m
Jan 22 17:48:03 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [NOTICE]   (236230) : haproxy version is 2.8.14-c23fe91
Jan 22 17:48:03 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [NOTICE]   (236230) : path to executable is /usr/sbin/haproxy
Jan 22 17:48:03 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [WARNING]  (236230) : Exiting Master process...
Jan 22 17:48:03 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [WARNING]  (236230) : Exiting Master process...
Jan 22 17:48:03 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [ALERT]    (236230) : Current worker (236232) exited with code 143 (Terminated)
Jan 22 17:48:03 np0005592767 neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8[236226]: [WARNING]  (236230) : All workers exited. Exiting... (0)
Jan 22 17:48:03 np0005592767 systemd[1]: libpod-822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214.scope: Deactivated successfully.
Jan 22 17:48:03 np0005592767 podman[236532]: 2026-01-22 22:48:03.285304627 +0000 UTC m=+0.071204475 container died 822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.302 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:03 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214-userdata-shm.mount: Deactivated successfully.
Jan 22 17:48:03 np0005592767 systemd[1]: var-lib-containers-storage-overlay-d50ff3701dcb0b9028a286e33d335870622f2c63c895972bf15521df5c91abc3-merged.mount: Deactivated successfully.
Jan 22 17:48:03 np0005592767 podman[236532]: 2026-01-22 22:48:03.331631898 +0000 UTC m=+0.117531746 container cleanup 822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:48:03 np0005592767 systemd[1]: libpod-conmon-822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214.scope: Deactivated successfully.
Jan 22 17:48:03 np0005592767 podman[236562]: 2026-01-22 22:48:03.419487893 +0000 UTC m=+0.053970218 container remove 822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.426 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8beeb951-faec-482d-985c-d2b85bf1005a]: (4, ('Thu Jan 22 10:48:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 (822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214)\n822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214\nThu Jan 22 10:48:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 (822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214)\n822648cf1b531bdb4d003adc729ee2995194d6240a9d5d348631b2e139a32214\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.427 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[00a3c5de-0e65-438f-ad98-fa5b7ee167a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.428 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09b515c7-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.485 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:03 np0005592767 kernel: tap09b515c7-d0: left promiscuous mode
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.488 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.491 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[98dff72c-e0fd-4274-acb7-d63a5447397c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.519 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab91a4c-1db8-4811-8c94-769026f4c60c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.521 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[00db3412-aab1-493d-999b-0eacd6aa1ba5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.529 182627 DEBUG nova.compute.manager [req-c79629eb-2147-4ee4-a80b-e13a25987814 req-77935b29-bda9-49ac-831c-c2475909dbab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-unplugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.529 182627 DEBUG oslo_concurrency.lockutils [req-c79629eb-2147-4ee4-a80b-e13a25987814 req-77935b29-bda9-49ac-831c-c2475909dbab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.530 182627 DEBUG oslo_concurrency.lockutils [req-c79629eb-2147-4ee4-a80b-e13a25987814 req-77935b29-bda9-49ac-831c-c2475909dbab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.530 182627 DEBUG oslo_concurrency.lockutils [req-c79629eb-2147-4ee4-a80b-e13a25987814 req-77935b29-bda9-49ac-831c-c2475909dbab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.530 182627 DEBUG nova.compute.manager [req-c79629eb-2147-4ee4-a80b-e13a25987814 req-77935b29-bda9-49ac-831c-c2475909dbab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] No waiting events found dispatching network-vif-unplugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:48:03 np0005592767 nova_compute[182623]: 2026-01-22 22:48:03.530 182627 DEBUG nova.compute.manager [req-c79629eb-2147-4ee4-a80b-e13a25987814 req-77935b29-bda9-49ac-831c-c2475909dbab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-unplugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.541 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d8eb84-7f18-4aba-87d5-54d89b44bbf0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565302, 'reachable_time': 35265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236577, 'error': None, 'target': 'ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.544 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09b515c7-d044-43d4-b895-408eb5de1fd8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.545 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[9eedcc13-b3d2-4ff2-bb51-1fd4e0326272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:03 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:03.545 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:48:03 np0005592767 systemd[1]: run-netns-ovnmeta\x2d09b515c7\x2dd044\x2d43d4\x2db895\x2d408eb5de1fd8.mount: Deactivated successfully.
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.211 182627 DEBUG nova.network.neutron [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.228 182627 INFO nova.compute.manager [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Took 1.16 seconds to deallocate network for instance.#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.261 182627 DEBUG nova.network.neutron [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updated VIF entry in instance network info cache for port 2b7b67c3-2e63-4c82-aef2-a408a460e74b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.262 182627 DEBUG nova.network.neutron [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Updating instance_info_cache with network_info: [{"id": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "address": "fa:16:3e:fa:f7:44", "network": {"id": "55b79226-17c7-4623-9f19-8585aca1b119", "bridge": "br-int", "label": "tempest-network-smoke--1245745439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b7b67c3-2e", "ovs_interfaceid": "2b7b67c3-2e63-4c82-aef2-a408a460e74b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "address": "fa:16:3e:1d:2f:14", "network": {"id": "09b515c7-d044-43d4-b895-408eb5de1fd8", "bridge": "br-int", "label": "tempest-network-smoke--555345497", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": 
[], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1d:2f14", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb61acc56-ff", "ovs_interfaceid": "b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.294 182627 DEBUG oslo_concurrency.lockutils [req-11d24f48-e2d5-4222-87f2-2c4044503266 req-ed5a4bd7-43bb-48c2-9d59-9f3af620a115 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.308 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.309 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.366 182627 DEBUG nova.compute.provider_tree [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.381 182627 DEBUG nova.scheduler.client.report [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.402 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.425 182627 INFO nova.scheduler.client.report [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance bb1cbc4a-90d9-439b-a2bf-0d4a3b533533#033[00m
Jan 22 17:48:04 np0005592767 nova_compute[182623]: 2026-01-22 22:48:04.491 182627 DEBUG oslo_concurrency.lockutils [None req-5ec9bddf-2ada-4cf0-9922-8e20afeb7dd4 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.893s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.113 182627 DEBUG nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.113 182627 DEBUG oslo_concurrency.lockutils [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.114 182627 DEBUG oslo_concurrency.lockutils [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.114 182627 DEBUG oslo_concurrency.lockutils [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.115 182627 DEBUG nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] No waiting events found dispatching network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.115 182627 WARNING nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received unexpected event network-vif-plugged-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.116 182627 DEBUG nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-deleted-b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.116 182627 INFO nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Neutron deleted interface b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9; detaching it from the instance and deleting it from the info cache#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.117 182627 DEBUG nova.network.neutron [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.121 182627 DEBUG nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Detach interface failed, port_id=b61acc56-ff7c-4ebc-b32c-d5e1bbec18a9, reason: Instance bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.122 182627 DEBUG nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-deleted-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.123 182627 INFO nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Neutron deleted interface 2b7b67c3-2e63-4c82-aef2-a408a460e74b; detaching it from the instance and deleting it from the info cache#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.123 182627 DEBUG nova.network.neutron [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.126 182627 DEBUG nova.compute.manager [req-12a5dddc-436e-45c7-a0a8-0b8f91d69e84 req-876153b5-85d9-4ec7-bee0-8cd8b246e9d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Detach interface failed, port_id=2b7b67c3-2e63-4c82-aef2-a408a460e74b, reason: Instance bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.622 182627 DEBUG nova.compute.manager [req-4f763882-675f-49a8-a03d-aa29585eb1aa req-03d8b39d-0f05-44f8-a38b-1d859a3838cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.622 182627 DEBUG oslo_concurrency.lockutils [req-4f763882-675f-49a8-a03d-aa29585eb1aa req-03d8b39d-0f05-44f8-a38b-1d859a3838cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.623 182627 DEBUG oslo_concurrency.lockutils [req-4f763882-675f-49a8-a03d-aa29585eb1aa req-03d8b39d-0f05-44f8-a38b-1d859a3838cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.623 182627 DEBUG oslo_concurrency.lockutils [req-4f763882-675f-49a8-a03d-aa29585eb1aa req-03d8b39d-0f05-44f8-a38b-1d859a3838cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "bb1cbc4a-90d9-439b-a2bf-0d4a3b533533-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.623 182627 DEBUG nova.compute.manager [req-4f763882-675f-49a8-a03d-aa29585eb1aa req-03d8b39d-0f05-44f8-a38b-1d859a3838cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] No waiting events found dispatching network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:48:05 np0005592767 nova_compute[182623]: 2026-01-22 22:48:05.623 182627 WARNING nova.compute.manager [req-4f763882-675f-49a8-a03d-aa29585eb1aa req-03d8b39d-0f05-44f8-a38b-1d859a3838cd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Received unexpected event network-vif-plugged-2b7b67c3-2e63-4c82-aef2-a408a460e74b for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:48:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:06.548 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:48:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:48:07 np0005592767 nova_compute[182623]: 2026-01-22 22:48:07.559 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:07 np0005592767 nova_compute[182623]: 2026-01-22 22:48:07.962 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:10 np0005592767 podman[236578]: 2026-01-22 22:48:10.180415873 +0000 UTC m=+0.096395008 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:48:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:12.119 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:12.120 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:12.120 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:12 np0005592767 nova_compute[182623]: 2026-01-22 22:48:12.605 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:12 np0005592767 nova_compute[182623]: 2026-01-22 22:48:12.964 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:15 np0005592767 podman[236601]: 2026-01-22 22:48:15.169654518 +0000 UTC m=+0.076754232 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, vcs-type=git, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 17:48:15 np0005592767 podman[236600]: 2026-01-22 22:48:15.171520671 +0000 UTC m=+0.091831149 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:48:17 np0005592767 nova_compute[182623]: 2026-01-22 22:48:17.607 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:17 np0005592767 nova_compute[182623]: 2026-01-22 22:48:17.918 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122082.9160726, bb1cbc4a-90d9-439b-a2bf-0d4a3b533533 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:48:17 np0005592767 nova_compute[182623]: 2026-01-22 22:48:17.918 182627 INFO nova.compute.manager [-] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:48:17 np0005592767 nova_compute[182623]: 2026-01-22 22:48:17.935 182627 DEBUG nova.compute.manager [None req-a078ecde-c29d-42ee-bc28-b7df96e1ddd6 - - - - - -] [instance: bb1cbc4a-90d9-439b-a2bf-0d4a3b533533] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:48:17 np0005592767 nova_compute[182623]: 2026-01-22 22:48:17.966 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:18 np0005592767 nova_compute[182623]: 2026-01-22 22:48:18.241 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:18 np0005592767 nova_compute[182623]: 2026-01-22 22:48:18.431 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.595 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.596 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.614 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.731 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.731 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.741 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.741 182627 INFO nova.compute.claims [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.883 182627 DEBUG nova.compute.provider_tree [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.904 182627 DEBUG nova.scheduler.client.report [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.927 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:20 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.929 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:20.999 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.001 182627 DEBUG nova.network.neutron [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.021 182627 INFO nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.037 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.148 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.150 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.151 182627 INFO nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Creating image(s)#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.153 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "/var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.153 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "/var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.155 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "/var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.155 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "150fb5c02402acb366211676d7eb421976af7022" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.156 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "150fb5c02402acb366211676d7eb421976af7022" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:21 np0005592767 nova_compute[182623]: 2026-01-22 22:48:21.185 182627 DEBUG nova.policy [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abbb13a7c01949c8b45e4e3263026c12', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.601 182627 DEBUG nova.network.neutron [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Successfully created port: 29654d82-4828-4fb8-a86a-6d77b3f6ae38 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.631 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.697 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.part --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.699 182627 DEBUG nova.virt.images [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] 010de27c-f8a9-42c9-ab81-c3983d9679a0 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.701 182627 DEBUG nova.privsep.utils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.701 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.part /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:22 np0005592767 nova_compute[182623]: 2026-01-22 22:48:22.990 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.part /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.converted" returned: 0 in 0.289s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.003 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.064 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022.converted --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.065 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "150fb5c02402acb366211676d7eb421976af7022" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.081 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:23 np0005592767 podman[236657]: 2026-01-22 22:48:23.134254664 +0000 UTC m=+0.053653299 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:48:23 np0005592767 podman[236660]: 2026-01-22 22:48:23.13446054 +0000 UTC m=+0.051530989 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.161 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.162 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "150fb5c02402acb366211676d7eb421976af7022" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.162 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "150fb5c02402acb366211676d7eb421976af7022" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.175 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.236 182627 DEBUG nova.network.neutron [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Successfully updated port: 29654d82-4828-4fb8-a86a-6d77b3f6ae38 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.257 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.258 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022,backing_fmt=raw /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.278 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.278 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquired lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.278 182627 DEBUG nova.network.neutron [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.295 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022,backing_fmt=raw /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.297 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "150fb5c02402acb366211676d7eb421976af7022" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.297 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.329 182627 DEBUG nova.compute.manager [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-changed-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.330 182627 DEBUG nova.compute.manager [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Refreshing instance network info cache due to event network-changed-29654d82-4828-4fb8-a86a-6d77b3f6ae38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.331 182627 DEBUG oslo_concurrency.lockutils [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.362 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.363 182627 DEBUG nova.objects.instance [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lazy-loading 'migration_context' on Instance uuid c07ca635-3e21-4a87-919b-1eeca64c5282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.380 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.380 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Ensure instance console log exists: /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.381 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.382 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.382 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:23 np0005592767 nova_compute[182623]: 2026-01-22 22:48:23.458 182627 DEBUG nova.network.neutron [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.309 182627 DEBUG nova.network.neutron [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updating instance_info_cache with network_info: [{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.331 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Releasing lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.331 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Instance network_info: |[{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.332 182627 DEBUG oslo_concurrency.lockutils [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.333 182627 DEBUG nova.network.neutron [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Refreshing network info cache for port 29654d82-4828-4fb8-a86a-6d77b3f6ae38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.338 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Start _get_guest_xml network_info=[{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='7911fc838ac2ce1239371f163407d229',container_format='bare',created_at=2026-01-22T22:48:13Z,direct_url=<?>,disk_format='qcow2',id=010de27c-f8a9-42c9-ab81-c3983d9679a0,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-996836153',owner='a0876e1a4cab4f9997487dc31953aafd',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T22:48:17Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '010de27c-f8a9-42c9-ab81-c3983d9679a0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.345 182627 WARNING nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.356 182627 DEBUG nova.virt.libvirt.host [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.357 182627 DEBUG nova.virt.libvirt.host [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.364 182627 DEBUG nova.virt.libvirt.host [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.365 182627 DEBUG nova.virt.libvirt.host [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.366 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.366 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='7911fc838ac2ce1239371f163407d229',container_format='bare',created_at=2026-01-22T22:48:13Z,direct_url=<?>,disk_format='qcow2',id=010de27c-f8a9-42c9-ab81-c3983d9679a0,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-996836153',owner='a0876e1a4cab4f9997487dc31953aafd',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T22:48:17Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.367 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.367 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.368 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.368 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.368 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.369 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.369 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.369 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.370 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.371 182627 DEBUG nova.virt.hardware [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.377 182627 DEBUG nova.virt.libvirt.vif [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:48:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1180307597',display_name='tempest-TestSnapshotPattern-server-1180307597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1180307597',id=163,image_ref='010de27c-f8a9-42c9-ab81-c3983d9679a0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKRyYAeVMNtC/j+MZtGGRG3eEEuekA15beqoQuOmHPR2UVmKb27cRtpJBpme1vKuXPPT5TKSpNW135l4FCwnjJAiPROKVyFsz2cyOtDZC0vbf3qqtMTnPoOqjT3eszeS0g==',key_name='tempest-TestSnapshotPattern-592124350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a0876e1a4cab4f9997487dc31953aafd',ramdisk_id='',reservation_id='r-2vnizph1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='cb8ac1a6-ad25-4019-add5-64c347b769cb',image_min_disk='1',image_min_ram='0',image_owner_id='a0876e1a4cab4f9997487dc31953aafd',image_owner_project_name='tempest-TestSnapshotPattern-1578752051',image_owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member',image_user_id='abbb13a7c01949c8b45e4e3263026c12',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1578752051',owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:48:21Z,user_data=None,user_id='abbb13a7c01949c8b45e4e3263026c12',uuid=c07ca6
35-3e21-4a87-919b-1eeca64c5282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.377 182627 DEBUG nova.network.os_vif_util [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converting VIF {"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.378 182627 DEBUG nova.network.os_vif_util [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.379 182627 DEBUG nova.objects.instance [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lazy-loading 'pci_devices' on Instance uuid c07ca635-3e21-4a87-919b-1eeca64c5282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.394 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <uuid>c07ca635-3e21-4a87-919b-1eeca64c5282</uuid>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <name>instance-000000a3</name>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestSnapshotPattern-server-1180307597</nova:name>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:48:24</nova:creationTime>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:user uuid="abbb13a7c01949c8b45e4e3263026c12">tempest-TestSnapshotPattern-1578752051-project-member</nova:user>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:project uuid="a0876e1a4cab4f9997487dc31953aafd">tempest-TestSnapshotPattern-1578752051</nova:project>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="010de27c-f8a9-42c9-ab81-c3983d9679a0"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        <nova:port uuid="29654d82-4828-4fb8-a86a-6d77b3f6ae38">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <entry name="serial">c07ca635-3e21-4a87-919b-1eeca64c5282</entry>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <entry name="uuid">c07ca635-3e21-4a87-919b-1eeca64c5282</entry>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.config"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:55:94:3d"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <target dev="tap29654d82-48"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/console.log" append="off"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <input type="keyboard" bus="usb"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:48:24 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:48:24 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:48:24 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:48:24 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.395 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Preparing to wait for external event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.396 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.396 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.397 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.398 182627 DEBUG nova.virt.libvirt.vif [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:48:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1180307597',display_name='tempest-TestSnapshotPattern-server-1180307597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1180307597',id=163,image_ref='010de27c-f8a9-42c9-ab81-c3983d9679a0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKRyYAeVMNtC/j+MZtGGRG3eEEuekA15beqoQuOmHPR2UVmKb27cRtpJBpme1vKuXPPT5TKSpNW135l4FCwnjJAiPROKVyFsz2cyOtDZC0vbf3qqtMTnPoOqjT3eszeS0g==',key_name='tempest-TestSnapshotPattern-592124350',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a0876e1a4cab4f9997487dc31953aafd',ramdisk_id='',reservation_id='r-2vnizph1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='cb8ac1a6-ad25-4019-add5-64c347b769cb',image_min_disk='1',image_min_ram='0',image_owner_id='a0876e1a4cab4f9997487dc31953aafd',image_owner_project_name='tempest-TestSnapshotPattern-1578752051',image_owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member',image_user_id='abbb13a7c01949c8b45e4e3263026c12',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1578752051',owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:48:21Z,user_data=None,user_id='abbb13a7c01949c8b45e4e3263026c12',u
uid=c07ca635-3e21-4a87-919b-1eeca64c5282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.398 182627 DEBUG nova.network.os_vif_util [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converting VIF {"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.399 182627 DEBUG nova.network.os_vif_util [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.400 182627 DEBUG os_vif [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.401 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.402 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.402 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.406 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.406 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29654d82-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.408 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29654d82-48, col_values=(('external_ids', {'iface-id': '29654d82-4828-4fb8-a86a-6d77b3f6ae38', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:94:3d', 'vm-uuid': 'c07ca635-3e21-4a87-919b-1eeca64c5282'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.410 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:24 np0005592767 NetworkManager[54973]: <info>  [1769122104.4115] manager: (tap29654d82-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.415 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.419 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.420 182627 INFO os_vif [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48')#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.482 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.483 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.483 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] No VIF found with MAC fa:16:3e:55:94:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.483 182627 INFO nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Using config drive#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.949 182627 INFO nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Creating config drive at /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.config#033[00m
Jan 22 17:48:24 np0005592767 nova_compute[182623]: 2026-01-22 22:48:24.956 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw6qdff33 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.084 182627 DEBUG oslo_concurrency.processutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw6qdff33" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:25 np0005592767 kernel: tap29654d82-48: entered promiscuous mode
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.1465] manager: (tap29654d82-48): new Tun device (/org/freedesktop/NetworkManager/Devices/330)
Jan 22 17:48:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:25Z|00716|binding|INFO|Claiming lport 29654d82-4828-4fb8-a86a-6d77b3f6ae38 for this chassis.
Jan 22 17:48:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:25Z|00717|binding|INFO|29654d82-4828-4fb8-a86a-6d77b3f6ae38: Claiming fa:16:3e:55:94:3d 10.100.0.11
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.147 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.160 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.1637] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.1648] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.166 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:94:3d 10.100.0.11'], port_security=['fa:16:3e:55:94:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c07ca635-3e21-4a87-919b-1eeca64c5282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f14033-82f9-4533-a194-36532baa893b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '039e09b7-4927-4c69-bb9d-1012bf4a1d89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c5e7990-8af4-4ab4-b8e4-c75ffda3dd74, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=29654d82-4828-4fb8-a86a-6d77b3f6ae38) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.167 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 29654d82-4828-4fb8-a86a-6d77b3f6ae38 in datapath f3f14033-82f9-4533-a194-36532baa893b bound to our chassis#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.169 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f3f14033-82f9-4533-a194-36532baa893b#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.181 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[88371dd6-10f6-4f74-91a1-3b2efdafc6b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.181 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf3f14033-81 in ovnmeta-f3f14033-82f9-4533-a194-36532baa893b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.184 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf3f14033-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.184 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ac24b04e-4927-46a8-be45-3b12a5cdb74b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.185 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae9a586-a266-4bf5-b499-f982019171f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 systemd-udevd[236732]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:48:25 np0005592767 systemd-machined[153912]: New machine qemu-87-instance-000000a3.
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.2029] device (tap29654d82-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.2036] device (tap29654d82-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.202 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8dbf5e-d222-4a93-97d8-2b044ddc3bfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.237 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[58a4002a-01bb-4283-8664-817a50b03c94]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.275 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[37af36c4-0a6e-49b3-9668-8ae6251be47a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.292 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c1bae41c-39cc-451e-bcf7-01ae61674111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.2937] manager: (tapf3f14033-80): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Jan 22 17:48:25 np0005592767 systemd[1]: Started Virtual Machine qemu-87-instance-000000a3.
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.317 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.334 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[a65c1b56-c77f-4a48-b25e-be0aa494c8eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.336 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c0263ff1-ad5b-40c6-ba99-9ee89fe7234c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.339 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:25Z|00718|binding|INFO|Setting lport 29654d82-4828-4fb8-a86a-6d77b3f6ae38 ovn-installed in OVS
Jan 22 17:48:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:25Z|00719|binding|INFO|Setting lport 29654d82-4828-4fb8-a86a-6d77b3f6ae38 up in Southbound
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.349 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.3583] device (tapf3f14033-80): carrier: link connected
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.366 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[60a855c6-3e76-41af-beec-96f76226bce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.387 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b3688c4c-2ecc-4307-a43f-b8411d959682]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3f14033-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:30:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570193, 'reachable_time': 16834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236764, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.407 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[abfd15a8-1265-4ba5-9e17-7b22a3389915]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:3086'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570193, 'tstamp': 570193}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236766, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.426 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[82e6675f-1a5a-4ec0-9877-e727ee889b49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3f14033-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:30:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570193, 'reachable_time': 16834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236767, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.457 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5c63a6-6bf2-456e-a39d-0865750e6a71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.520 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5d86e55e-648d-440e-baf5-e6146bd4e32b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.521 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f14033-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.522 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.522 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3f14033-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.524 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 NetworkManager[54973]: <info>  [1769122105.5257] manager: (tapf3f14033-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 22 17:48:25 np0005592767 kernel: tapf3f14033-80: entered promiscuous mode
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.529 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.531 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf3f14033-80, col_values=(('external_ids', {'iface-id': '941474a6-10cc-4642-b048-e5e47f4d8a09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.532 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:25Z|00720|binding|INFO|Releasing lport 941474a6-10cc-4642-b048-e5e47f4d8a09 from this chassis (sb_readonly=0)
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.562 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f3f14033-82f9-4533-a194-36532baa893b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f3f14033-82f9-4533-a194-36532baa893b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.563 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[80649ed9-8b06-4e17-8265-0531d2f12161]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.564 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-f3f14033-82f9-4533-a194-36532baa893b
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/f3f14033-82f9-4533-a194-36532baa893b.pid.haproxy
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID f3f14033-82f9-4533-a194-36532baa893b
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:48:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:25.565 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'env', 'PROCESS_TAG=haproxy-f3f14033-82f9-4533-a194-36532baa893b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f3f14033-82f9-4533-a194-36532baa893b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.756 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122105.7565122, c07ca635-3e21-4a87-919b-1eeca64c5282 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.757 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] VM Started (Lifecycle Event)#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.779 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.784 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122105.757306, c07ca635-3e21-4a87-919b-1eeca64c5282 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.785 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.807 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.810 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.830 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.984 182627 DEBUG nova.compute.manager [req-bf837f7d-1a02-4a4d-b2a7-0369aac7a6d0 req-4caa8193-5799-4fee-802c-fe8e338e7f15 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.986 182627 DEBUG oslo_concurrency.lockutils [req-bf837f7d-1a02-4a4d-b2a7-0369aac7a6d0 req-4caa8193-5799-4fee-802c-fe8e338e7f15 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:25 np0005592767 podman[236806]: 2026-01-22 22:48:25.986524893 +0000 UTC m=+0.061837340 container create 76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.987 182627 DEBUG oslo_concurrency.lockutils [req-bf837f7d-1a02-4a4d-b2a7-0369aac7a6d0 req-4caa8193-5799-4fee-802c-fe8e338e7f15 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.989 182627 DEBUG oslo_concurrency.lockutils [req-bf837f7d-1a02-4a4d-b2a7-0369aac7a6d0 req-4caa8193-5799-4fee-802c-fe8e338e7f15 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.989 182627 DEBUG nova.compute.manager [req-bf837f7d-1a02-4a4d-b2a7-0369aac7a6d0 req-4caa8193-5799-4fee-802c-fe8e338e7f15 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Processing event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:48:25 np0005592767 nova_compute[182623]: 2026-01-22 22:48:25.992 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.000 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122106.0003908, c07ca635-3e21-4a87-919b-1eeca64c5282 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.002 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.008 182627 DEBUG nova.network.neutron [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updated VIF entry in instance network info cache for port 29654d82-4828-4fb8-a86a-6d77b3f6ae38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.010 182627 DEBUG nova.network.neutron [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updating instance_info_cache with network_info: [{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.012 182627 DEBUG nova.virt.libvirt.driver [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.016 182627 INFO nova.virt.libvirt.driver [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Instance spawned successfully.#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.017 182627 INFO nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Took 4.87 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.017 182627 DEBUG nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.025 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.028 182627 DEBUG oslo_concurrency.lockutils [req-98bc823a-d69c-492b-8a20-319831c76e79 req-e65e7af9-459f-4c2c-820b-75cb13e4a86c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.031 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:48:26 np0005592767 systemd[1]: Started libpod-conmon-76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231.scope.
Jan 22 17:48:26 np0005592767 podman[236806]: 2026-01-22 22:48:25.947560031 +0000 UTC m=+0.022872538 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.054 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:48:26 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:48:26 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27dac8baa3077191f40e0145e952fb8780ad12d11319a10dc6ac102e3bdfb999/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:48:26 np0005592767 podman[236806]: 2026-01-22 22:48:26.103768039 +0000 UTC m=+0.179080626 container init 76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 17:48:26 np0005592767 podman[236806]: 2026-01-22 22:48:26.108531934 +0000 UTC m=+0.183844391 container start 76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.108 182627 INFO nova.compute.manager [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Took 5.43 seconds to build instance.#033[00m
Jan 22 17:48:26 np0005592767 nova_compute[182623]: 2026-01-22 22:48:26.126 182627 DEBUG oslo_concurrency.lockutils [None req-f93a20d5-2626-464e-8e5e-95a56f417910 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:26 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [NOTICE]   (236825) : New worker (236827) forked
Jan 22 17:48:26 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [NOTICE]   (236825) : Loading success.
Jan 22 17:48:27 np0005592767 nova_compute[182623]: 2026-01-22 22:48:27.612 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.275 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.326 182627 DEBUG nova.compute.manager [req-668a8d98-a576-4473-bc70-e32dc9075ec9 req-5d5e9b09-c8ab-4b44-b0b4-51415d9263bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.327 182627 DEBUG oslo_concurrency.lockutils [req-668a8d98-a576-4473-bc70-e32dc9075ec9 req-5d5e9b09-c8ab-4b44-b0b4-51415d9263bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.327 182627 DEBUG oslo_concurrency.lockutils [req-668a8d98-a576-4473-bc70-e32dc9075ec9 req-5d5e9b09-c8ab-4b44-b0b4-51415d9263bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.327 182627 DEBUG oslo_concurrency.lockutils [req-668a8d98-a576-4473-bc70-e32dc9075ec9 req-5d5e9b09-c8ab-4b44-b0b4-51415d9263bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.328 182627 DEBUG nova.compute.manager [req-668a8d98-a576-4473-bc70-e32dc9075ec9 req-5d5e9b09-c8ab-4b44-b0b4-51415d9263bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] No waiting events found dispatching network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:48:28 np0005592767 nova_compute[182623]: 2026-01-22 22:48:28.328 182627 WARNING nova.compute.manager [req-668a8d98-a576-4473-bc70-e32dc9075ec9 req-5d5e9b09-c8ab-4b44-b0b4-51415d9263bc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received unexpected event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:48:29 np0005592767 podman[236836]: 2026-01-22 22:48:29.15688733 +0000 UTC m=+0.061924623 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:48:29 np0005592767 nova_compute[182623]: 2026-01-22 22:48:29.411 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:29 np0005592767 nova_compute[182623]: 2026-01-22 22:48:29.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:30 np0005592767 nova_compute[182623]: 2026-01-22 22:48:30.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:31 np0005592767 nova_compute[182623]: 2026-01-22 22:48:31.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:32 np0005592767 nova_compute[182623]: 2026-01-22 22:48:32.661 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:32 np0005592767 nova_compute[182623]: 2026-01-22 22:48:32.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:32 np0005592767 nova_compute[182623]: 2026-01-22 22:48:32.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:48:32 np0005592767 nova_compute[182623]: 2026-01-22 22:48:32.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:48:33 np0005592767 nova_compute[182623]: 2026-01-22 22:48:33.538 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:48:33 np0005592767 nova_compute[182623]: 2026-01-22 22:48:33.538 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:48:33 np0005592767 nova_compute[182623]: 2026-01-22 22:48:33.539 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:48:33 np0005592767 nova_compute[182623]: 2026-01-22 22:48:33.539 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c07ca635-3e21-4a87-919b-1eeca64c5282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:48:34 np0005592767 nova_compute[182623]: 2026-01-22 22:48:34.416 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:35 np0005592767 nova_compute[182623]: 2026-01-22 22:48:35.654 182627 DEBUG nova.compute.manager [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-changed-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:35 np0005592767 nova_compute[182623]: 2026-01-22 22:48:35.655 182627 DEBUG nova.compute.manager [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Refreshing instance network info cache due to event network-changed-29654d82-4828-4fb8-a86a-6d77b3f6ae38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:48:35 np0005592767 nova_compute[182623]: 2026-01-22 22:48:35.656 182627 DEBUG oslo_concurrency.lockutils [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:48:37 np0005592767 nova_compute[182623]: 2026-01-22 22:48:37.663 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:37Z|00072|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.8 does not match offer 10.100.0.11
Jan 22 17:48:37 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:37Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:55:94:3d 10.100.0.11
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.206 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updating instance_info_cache with network_info: [{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.209 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.233 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.234 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.234 182627 DEBUG oslo_concurrency.lockutils [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.234 182627 DEBUG nova.network.neutron [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Refreshing network info cache for port 29654d82-4828-4fb8-a86a-6d77b3f6ae38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.236 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.237 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.258 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.259 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.259 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.260 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.348 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.435 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.437 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.493 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.716 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.718 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5510MB free_disk=73.05057525634766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.719 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.719 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.894 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance c07ca635-3e21-4a87-919b-1eeca64c5282 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.895 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.896 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.949 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:48:38 np0005592767 nova_compute[182623]: 2026-01-22 22:48:38.965 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:48:39 np0005592767 nova_compute[182623]: 2026-01-22 22:48:39.010 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:48:39 np0005592767 nova_compute[182623]: 2026-01-22 22:48:39.011 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:39 np0005592767 nova_compute[182623]: 2026-01-22 22:48:39.420 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:40 np0005592767 nova_compute[182623]: 2026-01-22 22:48:40.671 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:40 np0005592767 nova_compute[182623]: 2026-01-22 22:48:40.671 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:40 np0005592767 nova_compute[182623]: 2026-01-22 22:48:40.672 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:48:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:41Z|00074|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.8 does not match offer 10.100.0.11
Jan 22 17:48:41 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:41Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:55:94:3d 10.100.0.11
Jan 22 17:48:41 np0005592767 podman[236883]: 2026-01-22 22:48:41.186239432 +0000 UTC m=+0.097511720 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:48:42 np0005592767 nova_compute[182623]: 2026-01-22 22:48:42.714 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:42Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:55:94:3d 10.100.0.11
Jan 22 17:48:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:42Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:94:3d 10.100.0.11
Jan 22 17:48:43 np0005592767 nova_compute[182623]: 2026-01-22 22:48:43.107 182627 DEBUG nova.network.neutron [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updated VIF entry in instance network info cache for port 29654d82-4828-4fb8-a86a-6d77b3f6ae38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:48:43 np0005592767 nova_compute[182623]: 2026-01-22 22:48:43.108 182627 DEBUG nova.network.neutron [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updating instance_info_cache with network_info: [{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:48:43 np0005592767 nova_compute[182623]: 2026-01-22 22:48:43.185 182627 DEBUG oslo_concurrency.lockutils [req-3e470a7d-9a1f-4924-8851-fd4f530eef8d req-8390ac3c-2754-4fa5-8261-20627962dfe8 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:48:43 np0005592767 nova_compute[182623]: 2026-01-22 22:48:43.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:44 np0005592767 nova_compute[182623]: 2026-01-22 22:48:44.424 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:45 np0005592767 nova_compute[182623]: 2026-01-22 22:48:45.209 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:46 np0005592767 podman[236905]: 2026-01-22 22:48:46.179576513 +0000 UTC m=+0.068760595 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 22 17:48:46 np0005592767 podman[236904]: 2026-01-22 22:48:46.205233599 +0000 UTC m=+0.094425861 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:48:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:47.340 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:66:79 2001:db8:0:1:f816:3eff:fe33:6679 2001:db8::f816:3eff:fe33:6679'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe33:6679/64 2001:db8::f816:3eff:fe33:6679/64', 'neutron:device_id': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30651656-9209-4f2c-a0e4-55fbbfbf46e6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=58916e1d-6812-4cb1-a469-e8a2b6c851b7) old=Port_Binding(mac=['fa:16:3e:33:66:79 2001:db8::f816:3eff:fe33:6679'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe33:6679/64', 'neutron:device_id': 'ovnmeta-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b0b7be0-dc91-4e0d-bd73-07331822edfa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:48:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:47.342 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 58916e1d-6812-4cb1-a469-e8a2b6c851b7 in datapath 7b0b7be0-dc91-4e0d-bd73-07331822edfa updated#033[00m
Jan 22 17:48:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:47.344 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b0b7be0-dc91-4e0d-bd73-07331822edfa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:48:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:47.346 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b87c9c17-03b1-4a12-91a4-cf975e5c3d87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:47Z|00721|binding|INFO|Releasing lport 941474a6-10cc-4642-b048-e5e47f4d8a09 from this chassis (sb_readonly=0)
Jan 22 17:48:47 np0005592767 nova_compute[182623]: 2026-01-22 22:48:47.611 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:47 np0005592767 nova_compute[182623]: 2026-01-22 22:48:47.716 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:49 np0005592767 nova_compute[182623]: 2026-01-22 22:48:49.426 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:50 np0005592767 nova_compute[182623]: 2026-01-22 22:48:50.196 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.095 182627 DEBUG nova.compute.manager [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.175 182627 INFO nova.compute.manager [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] instance snapshotting#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.497 182627 INFO nova.virt.libvirt.driver [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Beginning live snapshot process#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.719 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:52 np0005592767 virtqemud[182095]: invalid argument: disk vda does not have an active block job
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.731 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.821 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json -f qcow2" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.823 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.924 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json -f qcow2" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.938 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.991 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:52 np0005592767 nova_compute[182623]: 2026-01-22 22:48:52.993 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbwc7j9lt/ee55aca7760d4f10bf3932e46c63f189.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.038 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbwc7j9lt/ee55aca7760d4f10bf3932e46c63f189.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.039 182627 INFO nova.virt.libvirt.driver [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.082 182627 DEBUG nova.virt.libvirt.guest [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] COPY block job progress, current cursor: 0 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.586 182627 DEBUG nova.virt.libvirt.guest [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] COPY block job progress, current cursor: 1048576 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.591 182627 INFO nova.virt.libvirt.driver [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.642 182627 DEBUG nova.privsep.utils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 22 17:48:53 np0005592767 nova_compute[182623]: 2026-01-22 22:48:53.643 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbwc7j9lt/ee55aca7760d4f10bf3932e46c63f189.delta /var/lib/nova/instances/snapshots/tmpbwc7j9lt/ee55aca7760d4f10bf3932e46c63f189 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:54 np0005592767 nova_compute[182623]: 2026-01-22 22:48:54.086 182627 DEBUG oslo_concurrency.processutils [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbwc7j9lt/ee55aca7760d4f10bf3932e46c63f189.delta /var/lib/nova/instances/snapshots/tmpbwc7j9lt/ee55aca7760d4f10bf3932e46c63f189" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:54 np0005592767 nova_compute[182623]: 2026-01-22 22:48:54.087 182627 INFO nova.virt.libvirt.driver [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Snapshot extracted, beginning image upload#033[00m
Jan 22 17:48:54 np0005592767 podman[236979]: 2026-01-22 22:48:54.1505087 +0000 UTC m=+0.062267172 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 17:48:54 np0005592767 podman[236980]: 2026-01-22 22:48:54.151168179 +0000 UTC m=+0.060264676 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:48:54 np0005592767 nova_compute[182623]: 2026-01-22 22:48:54.429 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:55 np0005592767 nova_compute[182623]: 2026-01-22 22:48:55.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:56 np0005592767 nova_compute[182623]: 2026-01-22 22:48:56.950 182627 INFO nova.virt.libvirt.driver [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Snapshot image upload complete#033[00m
Jan 22 17:48:56 np0005592767 nova_compute[182623]: 2026-01-22 22:48:56.951 182627 INFO nova.compute.manager [None req-1a9da74f-d67f-4141-8bb5-dc143b9d3791 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Took 4.76 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 22 17:48:57 np0005592767 nova_compute[182623]: 2026-01-22 22:48:57.721 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.898 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.900 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.900 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.900 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.900 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.901 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.927 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.948 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.949 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Image id 010de27c-f8a9-42c9-ab81-c3983d9679a0 yields fingerprint 150fb5c02402acb366211676d7eb421976af7022 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.949 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] image 010de27c-f8a9-42c9-ab81-c3983d9679a0 at (/var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022): checking#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.950 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] image 010de27c-f8a9-42c9-ab81-c3983d9679a0 at (/var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.952 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.953 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] c07ca635-3e21-4a87-919b-1eeca64c5282 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.954 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] c07ca635-3e21-4a87-919b-1eeca64c5282 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129#033[00m
Jan 22 17:48:58 np0005592767 nova_compute[182623]: 2026-01-22 22:48:58.954 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.050 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.050 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance c07ca635-3e21-4a87-919b-1eeca64c5282 is backed by 150fb5c02402acb366211676d7eb421976af7022 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.051 182627 WARNING nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.051 182627 WARNING nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.051 182627 WARNING nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.051 182627 WARNING nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.052 182627 WARNING nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.052 182627 WARNING nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.052 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Active base files: /var/lib/nova/instances/_base/150fb5c02402acb366211676d7eb421976af7022#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.052 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Removable base files: /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32 /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643 /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5 /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.053 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.053 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b3ed4f0e9759b70eca9697bcfc4a50c46b79fe32#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.053 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/cf6fcf8095e0aa4b9a2b46cf2ceae27d83cb3e3c#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.053 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ba85555e7564e6a234e110f556e0425220bc4643#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.053 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/7c9dda9354a1b10fc44c169b7a889804e407fad5#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.054 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bfc2a2b0a125b37685b6254f144e511a6d354259#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.054 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.054 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.054 182627 DEBUG nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.055 182627 INFO nova.virt.libvirt.imagecache [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.059 182627 DEBUG nova.compute.manager [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-changed-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.059 182627 DEBUG nova.compute.manager [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Refreshing instance network info cache due to event network-changed-29654d82-4828-4fb8-a86a-6d77b3f6ae38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.060 182627 DEBUG oslo_concurrency.lockutils [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.060 182627 DEBUG oslo_concurrency.lockutils [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.060 182627 DEBUG nova.network.neutron [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Refreshing network info cache for port 29654d82-4828-4fb8-a86a-6d77b3f6ae38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.211 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.212 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.212 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.213 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.216 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.232 182627 INFO nova.compute.manager [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Terminating instance#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.253 182627 DEBUG nova.compute.manager [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:48:59 np0005592767 kernel: tap29654d82-48 (unregistering): left promiscuous mode
Jan 22 17:48:59 np0005592767 NetworkManager[54973]: <info>  [1769122139.2873] device (tap29654d82-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.293 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:59Z|00722|binding|INFO|Releasing lport 29654d82-4828-4fb8-a86a-6d77b3f6ae38 from this chassis (sb_readonly=0)
Jan 22 17:48:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:59Z|00723|binding|INFO|Setting lport 29654d82-4828-4fb8-a86a-6d77b3f6ae38 down in Southbound
Jan 22 17:48:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:48:59Z|00724|binding|INFO|Removing iface tap29654d82-48 ovn-installed in OVS
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.298 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.316 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:94:3d 10.100.0.11'], port_security=['fa:16:3e:55:94:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c07ca635-3e21-4a87-919b-1eeca64c5282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f14033-82f9-4533-a194-36532baa893b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0876e1a4cab4f9997487dc31953aafd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '039e09b7-4927-4c69-bb9d-1012bf4a1d89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c5e7990-8af4-4ab4-b8e4-c75ffda3dd74, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=29654d82-4828-4fb8-a86a-6d77b3f6ae38) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.317 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 29654d82-4828-4fb8-a86a-6d77b3f6ae38 in datapath f3f14033-82f9-4533-a194-36532baa893b unbound from our chassis#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.318 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3f14033-82f9-4533-a194-36532baa893b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.320 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.319 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ce143dd6-29ed-494d-8537-9be8c7674cff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.320 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f3f14033-82f9-4533-a194-36532baa893b namespace which is not needed anymore#033[00m
Jan 22 17:48:59 np0005592767 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Jan 22 17:48:59 np0005592767 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a3.scope: Consumed 13.498s CPU time.
Jan 22 17:48:59 np0005592767 systemd-machined[153912]: Machine qemu-87-instance-000000a3 terminated.
Jan 22 17:48:59 np0005592767 podman[237026]: 2026-01-22 22:48:59.403871966 +0000 UTC m=+0.064076224 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.432 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [NOTICE]   (236825) : haproxy version is 2.8.14-c23fe91
Jan 22 17:48:59 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [NOTICE]   (236825) : path to executable is /usr/sbin/haproxy
Jan 22 17:48:59 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [WARNING]  (236825) : Exiting Master process...
Jan 22 17:48:59 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [ALERT]    (236825) : Current worker (236827) exited with code 143 (Terminated)
Jan 22 17:48:59 np0005592767 neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b[236821]: [WARNING]  (236825) : All workers exited. Exiting... (0)
Jan 22 17:48:59 np0005592767 systemd[1]: libpod-76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231.scope: Deactivated successfully.
Jan 22 17:48:59 np0005592767 podman[237071]: 2026-01-22 22:48:59.474484173 +0000 UTC m=+0.049225313 container died 76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.475 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.479 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231-userdata-shm.mount: Deactivated successfully.
Jan 22 17:48:59 np0005592767 systemd[1]: var-lib-containers-storage-overlay-27dac8baa3077191f40e0145e952fb8780ad12d11319a10dc6ac102e3bdfb999-merged.mount: Deactivated successfully.
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.515 182627 INFO nova.virt.libvirt.driver [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Instance destroyed successfully.#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.516 182627 DEBUG nova.objects.instance [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lazy-loading 'resources' on Instance uuid c07ca635-3e21-4a87-919b-1eeca64c5282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:48:59 np0005592767 podman[237071]: 2026-01-22 22:48:59.516517992 +0000 UTC m=+0.091259112 container cleanup 76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:48:59 np0005592767 systemd[1]: libpod-conmon-76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231.scope: Deactivated successfully.
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.535 182627 DEBUG nova.virt.libvirt.vif [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:48:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1180307597',display_name='tempest-TestSnapshotPattern-server-1180307597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1180307597',id=163,image_ref='010de27c-f8a9-42c9-ab81-c3983d9679a0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKRyYAeVMNtC/j+MZtGGRG3eEEuekA15beqoQuOmHPR2UVmKb27cRtpJBpme1vKuXPPT5TKSpNW135l4FCwnjJAiPROKVyFsz2cyOtDZC0vbf3qqtMTnPoOqjT3eszeS0g==',key_name='tempest-TestSnapshotPattern-592124350',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:48:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a0876e1a4cab4f9997487dc31953aafd',ramdisk_id='',reservation_id='r-2vnizph1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='cb8ac1a6-ad25-4019-add5-64c347b769cb',image_min_disk='1',image_min_ram='0',image_owner_id='a0876e1a4cab4f9997487dc31953aafd',image_owner_project_name='tempest-TestSnapshotPattern-1578752051',image_owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member',image_user_id='abbb13a7c01949c8b45e4e3263026c12',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-1578752051',owner_user_name='tempest-TestSnapshotPattern-1578752051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:48:56Z,user_data=None,user_id='abbb13a7c01949c8b45e4e3263026c12',uuid=c07ca635-3e21-4a87-919b-1eeca64c5282,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.536 182627 DEBUG nova.network.os_vif_util [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converting VIF {"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.536 182627 DEBUG nova.network.os_vif_util [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.537 182627 DEBUG os_vif [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.539 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.539 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29654d82-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.541 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.542 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.546 182627 INFO os_vif [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:94:3d,bridge_name='br-int',has_traffic_filtering=True,id=29654d82-4828-4fb8-a86a-6d77b3f6ae38,network=Network(f3f14033-82f9-4533-a194-36532baa893b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29654d82-48')#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.546 182627 INFO nova.virt.libvirt.driver [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Deleting instance files /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282_del#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.547 182627 INFO nova.virt.libvirt.driver [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Deletion of /var/lib/nova/instances/c07ca635-3e21-4a87-919b-1eeca64c5282_del complete#033[00m
Jan 22 17:48:59 np0005592767 podman[237117]: 2026-01-22 22:48:59.594954881 +0000 UTC m=+0.051164408 container remove 76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.600 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c9713717-40f8-4af8-a80f-77dc72657a5e]: (4, ('Thu Jan 22 10:48:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b (76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231)\n76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231\nThu Jan 22 10:48:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f3f14033-82f9-4533-a194-36532baa893b (76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231)\n76a712b1966049542f3de32f1a621e46249e07d683e59a433348766344317231\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.601 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[72055de8-5f41-45d0-8517-29f2e186f0a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.602 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f14033-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.604 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 kernel: tapf3f14033-80: left promiscuous mode
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.615 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.618 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c17dc65c-697e-4c60-83fe-4d8b2a893d26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.641 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3601f158-872f-4e08-aa9f-cd27b449bcc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.643 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[55c55cd6-da62-4672-84c7-7a1d869b7529]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.647 182627 INFO nova.compute.manager [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.647 182627 DEBUG oslo.service.loopingcall [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.648 182627 DEBUG nova.compute.manager [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:48:59 np0005592767 nova_compute[182623]: 2026-01-22 22:48:59.648 182627 DEBUG nova.network.neutron [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.662 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[82f9f842-66af-431c-a370-8aec07612387]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570184, 'reachable_time': 28190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237132, 'error': None, 'target': 'ovnmeta-f3f14033-82f9-4533-a194-36532baa893b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:48:59 np0005592767 systemd[1]: run-netns-ovnmeta\x2df3f14033\x2d82f9\x2d4533\x2da194\x2d36532baa893b.mount: Deactivated successfully.
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.668 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f3f14033-82f9-4533-a194-36532baa893b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:48:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:48:59.670 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cb67d3-53c0-42ff-9987-0a5ed41f93e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.889 182627 DEBUG nova.compute.manager [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-vif-unplugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.889 182627 DEBUG oslo_concurrency.lockutils [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.890 182627 DEBUG oslo_concurrency.lockutils [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.891 182627 DEBUG oslo_concurrency.lockutils [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.891 182627 DEBUG nova.compute.manager [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] No waiting events found dispatching network-vif-unplugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.892 182627 DEBUG nova.compute.manager [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-vif-unplugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.893 182627 DEBUG nova.compute.manager [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.893 182627 DEBUG oslo_concurrency.lockutils [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.894 182627 DEBUG oslo_concurrency.lockutils [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.894 182627 DEBUG oslo_concurrency.lockutils [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.895 182627 DEBUG nova.compute.manager [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] No waiting events found dispatching network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:49:00 np0005592767 nova_compute[182623]: 2026-01-22 22:49:00.895 182627 WARNING nova.compute.manager [req-c920f295-64c2-4aff-a667-a321a76fa690 req-6b81ff6f-cb5e-48de-b5e1-d69d95fd4d96 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received unexpected event network-vif-plugged-29654d82-4828-4fb8-a86a-6d77b3f6ae38 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.006 182627 DEBUG nova.network.neutron [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updated VIF entry in instance network info cache for port 29654d82-4828-4fb8-a86a-6d77b3f6ae38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.007 182627 DEBUG nova.network.neutron [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updating instance_info_cache with network_info: [{"id": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "address": "fa:16:3e:55:94:3d", "network": {"id": "f3f14033-82f9-4533-a194-36532baa893b", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1010598589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0876e1a4cab4f9997487dc31953aafd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29654d82-48", "ovs_interfaceid": "29654d82-4828-4fb8-a86a-6d77b3f6ae38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.027 182627 DEBUG oslo_concurrency.lockutils [req-b7f53cf1-95eb-41df-991f-fd8e2732d898 req-0d94c47c-8212-492f-b436-ac64f0981f01 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c07ca635-3e21-4a87-919b-1eeca64c5282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.300 182627 DEBUG nova.network.neutron [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.325 182627 INFO nova.compute.manager [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Took 1.68 seconds to deallocate network for instance.#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.415 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.416 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.454 182627 DEBUG nova.compute.manager [req-dccd05d3-1947-4746-84a6-dccdafe6801e req-210838e5-89c4-42c6-addd-4a29b29248b6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Received event network-vif-deleted-29654d82-4828-4fb8-a86a-6d77b3f6ae38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.478 182627 DEBUG nova.compute.provider_tree [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.494 182627 DEBUG nova.scheduler.client.report [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.521 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.556 182627 INFO nova.scheduler.client.report [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Deleted allocations for instance c07ca635-3e21-4a87-919b-1eeca64c5282#033[00m
Jan 22 17:49:01 np0005592767 nova_compute[182623]: 2026-01-22 22:49:01.656 182627 DEBUG oslo_concurrency.lockutils [None req-bb826623-d0ca-4972-9161-d283148d6739 abbb13a7c01949c8b45e4e3263026c12 a0876e1a4cab4f9997487dc31953aafd - - default default] Lock "c07ca635-3e21-4a87-919b-1eeca64c5282" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.027 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.027 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.044 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.120 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.121 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.127 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.128 182627 INFO nova.compute.claims [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.257 182627 DEBUG nova.compute.provider_tree [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.270 182627 DEBUG nova.scheduler.client.report [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.296 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.296 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.397 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.397 182627 DEBUG nova.network.neutron [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.413 182627 INFO nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.440 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.561 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.562 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.562 182627 INFO nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Creating image(s)#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.563 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "/var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.563 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.564 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "/var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.575 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.659 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.660 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.661 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.671 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.722 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.725 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.726 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.760 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.761 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.761 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.823 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.825 182627 DEBUG nova.virt.disk.api [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Checking if we can resize image /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.825 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.902 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.904 182627 DEBUG nova.virt.disk.api [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Cannot resize image /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.905 182627 DEBUG nova.objects.instance [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'migration_context' on Instance uuid b1398aa0-c60b-4f85-8470-bf1860e92421 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.920 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.920 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Ensure instance console log exists: /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.921 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.922 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:02 np0005592767 nova_compute[182623]: 2026-01-22 22:49:02.923 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:03 np0005592767 nova_compute[182623]: 2026-01-22 22:49:03.017 182627 DEBUG nova.policy [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6d72b45b07b4237a9bb58e93cc801f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffd58948cb444c25ae034a02c0344de7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:49:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:04.392 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:49:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:04.393 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:49:04 np0005592767 nova_compute[182623]: 2026-01-22 22:49:04.394 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:04.394 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:04 np0005592767 nova_compute[182623]: 2026-01-22 22:49:04.541 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:04 np0005592767 nova_compute[182623]: 2026-01-22 22:49:04.720 182627 DEBUG nova.network.neutron [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Successfully created port: cd7bb548-040e-4490-b6ad-99fce3c24eed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:49:06 np0005592767 nova_compute[182623]: 2026-01-22 22:49:06.951 182627 DEBUG nova.network.neutron [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Successfully updated port: cd7bb548-040e-4490-b6ad-99fce3c24eed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:49:06 np0005592767 nova_compute[182623]: 2026-01-22 22:49:06.976 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:49:06 np0005592767 nova_compute[182623]: 2026-01-22 22:49:06.976 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquired lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:49:06 np0005592767 nova_compute[182623]: 2026-01-22 22:49:06.977 182627 DEBUG nova.network.neutron [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:49:07 np0005592767 nova_compute[182623]: 2026-01-22 22:49:07.112 182627 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-changed-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:07 np0005592767 nova_compute[182623]: 2026-01-22 22:49:07.112 182627 DEBUG nova.compute.manager [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Refreshing instance network info cache due to event network-changed-cd7bb548-040e-4490-b6ad-99fce3c24eed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:49:07 np0005592767 nova_compute[182623]: 2026-01-22 22:49:07.113 182627 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:49:07 np0005592767 nova_compute[182623]: 2026-01-22 22:49:07.207 182627 DEBUG nova.network.neutron [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:49:07 np0005592767 nova_compute[182623]: 2026-01-22 22:49:07.724 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.144 182627 DEBUG nova.network.neutron [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updating instance_info_cache with network_info: [{"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.197 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Releasing lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.198 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Instance network_info: |[{"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.198 182627 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.199 182627 DEBUG nova.network.neutron [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Refreshing network info cache for port cd7bb548-040e-4490-b6ad-99fce3c24eed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.202 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Start _get_guest_xml network_info=[{"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.210 182627 WARNING nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.218 182627 DEBUG nova.virt.libvirt.host [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.219 182627 DEBUG nova.virt.libvirt.host [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.238 182627 DEBUG nova.virt.libvirt.host [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.239 182627 DEBUG nova.virt.libvirt.host [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.242 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.242 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.243 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.244 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.244 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.245 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.245 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.246 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.246 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.247 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.248 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.248 182627 DEBUG nova.virt.hardware [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.257 182627 DEBUG nova.virt.libvirt.vif [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:49:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1001577903',display_name='tempest-TestNetworkBasicOps-server-1001577903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1001577903',id=165,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOEarH5Oom9ikHlXd4qR1fqXmt95dmoizhFBtQZ+VJrYjfo4gWE5XcKfo2GdT5UuzqpCae4jcETw/nZpgv7EdSOjq2koqsL81q7akAsbAiRbKuQt0+cIBgw492k8qLNufQ==',key_name='tempest-TestNetworkBasicOps-1119247128',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-o2kxyzgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:49:02Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=b1398aa0-c60b-4f85-8470-bf1860e92421,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.258 182627 DEBUG nova.network.os_vif_util [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.259 182627 DEBUG nova.network.os_vif_util [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.261 182627 DEBUG nova.objects.instance [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'pci_devices' on Instance uuid b1398aa0-c60b-4f85-8470-bf1860e92421 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.282 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <uuid>b1398aa0-c60b-4f85-8470-bf1860e92421</uuid>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <name>instance-000000a5</name>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestNetworkBasicOps-server-1001577903</nova:name>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:49:08</nova:creationTime>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:user uuid="b6d72b45b07b4237a9bb58e93cc801f2">tempest-TestNetworkBasicOps-645382902-project-member</nova:user>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:project uuid="ffd58948cb444c25ae034a02c0344de7">tempest-TestNetworkBasicOps-645382902</nova:project>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        <nova:port uuid="cd7bb548-040e-4490-b6ad-99fce3c24eed">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <entry name="serial">b1398aa0-c60b-4f85-8470-bf1860e92421</entry>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <entry name="uuid">b1398aa0-c60b-4f85-8470-bf1860e92421</entry>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.config"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:d0:8d:28"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <target dev="tapcd7bb548-04"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/console.log" append="off"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:49:08 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:49:08 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:49:08 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:49:08 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.283 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Preparing to wait for external event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.283 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.283 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.283 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.284 182627 DEBUG nova.virt.libvirt.vif [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:49:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1001577903',display_name='tempest-TestNetworkBasicOps-server-1001577903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1001577903',id=165,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOEarH5Oom9ikHlXd4qR1fqXmt95dmoizhFBtQZ+VJrYjfo4gWE5XcKfo2GdT5UuzqpCae4jcETw/nZpgv7EdSOjq2koqsL81q7akAsbAiRbKuQt0+cIBgw492k8qLNufQ==',key_name='tempest-TestNetworkBasicOps-1119247128',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-o2kxyzgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:49:02Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=b1398aa0-c60b-4f85-8470-bf1860e92421,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.284 182627 DEBUG nova.network.os_vif_util [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.285 182627 DEBUG nova.network.os_vif_util [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.285 182627 DEBUG os_vif [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.286 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.286 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.286 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.290 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.290 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd7bb548-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.291 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd7bb548-04, col_values=(('external_ids', {'iface-id': 'cd7bb548-040e-4490-b6ad-99fce3c24eed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:8d:28', 'vm-uuid': 'b1398aa0-c60b-4f85-8470-bf1860e92421'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:08 np0005592767 NetworkManager[54973]: <info>  [1769122148.2935] manager: (tapcd7bb548-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.294 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.301 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.302 182627 INFO os_vif [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04')#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.379 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.380 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.380 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] No VIF found with MAC fa:16:3e:d0:8d:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:49:08 np0005592767 nova_compute[182623]: 2026-01-22 22:49:08.380 182627 INFO nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Using config drive#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.296 182627 INFO nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Creating config drive at /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.config#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.306 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpytwbkq0v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.445 182627 DEBUG oslo_concurrency.processutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpytwbkq0v" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:49:09 np0005592767 kernel: tapcd7bb548-04: entered promiscuous mode
Jan 22 17:49:09 np0005592767 NetworkManager[54973]: <info>  [1769122149.5211] manager: (tapcd7bb548-04): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.522 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:09Z|00725|binding|INFO|Claiming lport cd7bb548-040e-4490-b6ad-99fce3c24eed for this chassis.
Jan 22 17:49:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:09Z|00726|binding|INFO|cd7bb548-040e-4490-b6ad-99fce3c24eed: Claiming fa:16:3e:d0:8d:28 10.100.0.5
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.541 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:8d:28 10.100.0.5'], port_security=['fa:16:3e:d0:8d:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b1398aa0-c60b-4f85-8470-bf1860e92421', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ac9f56c-756c-4bae-b022-893ccfd8049d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec7fe50b-af21-462f-9bec-fc3a2a2489b8, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=cd7bb548-040e-4490-b6ad-99fce3c24eed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.542 104135 INFO neutron.agent.ovn.metadata.agent [-] Port cd7bb548-040e-4490-b6ad-99fce3c24eed in datapath 155ecbc3-b59d-47dc-a85e-f58d50789f60 bound to our chassis#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.545 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 155ecbc3-b59d-47dc-a85e-f58d50789f60#033[00m
Jan 22 17:49:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:09Z|00727|binding|INFO|Setting lport cd7bb548-040e-4490-b6ad-99fce3c24eed up in Southbound
Jan 22 17:49:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:09Z|00728|binding|INFO|Setting lport cd7bb548-040e-4490-b6ad-99fce3c24eed ovn-installed in OVS
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.546 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 systemd-udevd[237167]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.547 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.558 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ad9c13-dc8c-48af-831d-34c637e5a1c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.559 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap155ecbc3-b1 in ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:49:09 np0005592767 NetworkManager[54973]: <info>  [1769122149.5620] device (tapcd7bb548-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:49:09 np0005592767 NetworkManager[54973]: <info>  [1769122149.5626] device (tapcd7bb548-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.562 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap155ecbc3-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.562 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4c364a-85dd-40a8-8590-278cc3f0ed6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.563 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b7efe5ac-a6b2-4ee5-9d94-6ea0b05fe958]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 systemd-machined[153912]: New machine qemu-88-instance-000000a5.
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.578 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[1c446a2f-9e21-40e4-a80f-d2773d0f8be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 systemd[1]: Started Virtual Machine qemu-88-instance-000000a5.
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.604 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[be7181e1-0c5e-45cd-b9ec-f4903861f3cc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.641 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb05dbd-dfbc-4863-9572-afadb4fc8fc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 NetworkManager[54973]: <info>  [1769122149.6490] manager: (tap155ecbc3-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/337)
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.649 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[accc9fcf-5929-403b-bf29-09abb5c383d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.689 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6b88e369-1035-4835-931a-0115e162eb8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.693 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[9c14cc66-38b7-4a81-bc56-f330e63604d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 NetworkManager[54973]: <info>  [1769122149.7176] device (tap155ecbc3-b0): carrier: link connected
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.722 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3493bf4a-0334-41f4-b082-eddef7e7518f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.739 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[13593864-8092-41a1-b625-510714243254]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap155ecbc3-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:a3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574629, 'reachable_time': 36583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237203, 'error': None, 'target': 'ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.755 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1f07faa3-58ab-4277-8a6b-deec7b5a3c67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:a3d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574629, 'tstamp': 574629}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237204, 'error': None, 'target': 'ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.769 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc3216a-7954-4991-b947-aac76e3a19be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap155ecbc3-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:a3:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574629, 'reachable_time': 36583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237205, 'error': None, 'target': 'ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.800 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9640b30c-b1b4-4854-9e7c-b89b12e1caea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.859 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[41338cb3-b21e-4808-bfe5-2c26efa99e9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.861 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap155ecbc3-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.861 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.861 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap155ecbc3-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:09 np0005592767 NetworkManager[54973]: <info>  [1769122149.8643] manager: (tap155ecbc3-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Jan 22 17:49:09 np0005592767 kernel: tap155ecbc3-b0: entered promiscuous mode
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.863 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.866 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.868 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122149.8678432, b1398aa0-c60b-4f85-8470-bf1860e92421 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.868 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] VM Started (Lifecycle Event)#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.869 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap155ecbc3-b0, col_values=(('external_ids', {'iface-id': '565ba943-8c16-4ef6-8a84-e44dd49c47d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:09 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:09Z|00729|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.870 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.871 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/155ecbc3-b59d-47dc-a85e-f58d50789f60.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/155ecbc3-b59d-47dc-a85e-f58d50789f60.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.872 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[45dd990a-dd97-4d32-9cd6-5e0d6bac2ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.873 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-155ecbc3-b59d-47dc-a85e-f58d50789f60
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/155ecbc3-b59d-47dc-a85e-f58d50789f60.pid.haproxy
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 155ecbc3-b59d-47dc-a85e-f58d50789f60
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:49:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:09.874 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'env', 'PROCESS_TAG=haproxy-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/155ecbc3-b59d-47dc-a85e-f58d50789f60.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.881 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.889 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.893 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122149.868052, b1398aa0-c60b-4f85-8470-bf1860e92421 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.893 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.917 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.920 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:49:09 np0005592767 nova_compute[182623]: 2026-01-22 22:49:09.945 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:49:10 np0005592767 podman[237244]: 2026-01-22 22:49:10.270546699 +0000 UTC m=+0.057636541 container create f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:49:10 np0005592767 systemd[1]: Started libpod-conmon-f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36.scope.
Jan 22 17:49:10 np0005592767 podman[237244]: 2026-01-22 22:49:10.237448213 +0000 UTC m=+0.024538035 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:49:10 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:49:10 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8eca81372ca5b3997c6963068193b256ca830ff126c45373efbea518cefc0aaa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:49:10 np0005592767 podman[237244]: 2026-01-22 22:49:10.360406811 +0000 UTC m=+0.147496613 container init f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:49:10 np0005592767 podman[237244]: 2026-01-22 22:49:10.36885029 +0000 UTC m=+0.155940092 container start f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:49:10 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [NOTICE]   (237264) : New worker (237266) forked
Jan 22 17:49:10 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [NOTICE]   (237264) : Loading success.
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.939 182627 DEBUG nova.compute.manager [req-13e02398-4950-4380-bf4d-e5d2e12998f1 req-065902a9-26ce-4dd3-9283-8891c6457c65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.939 182627 DEBUG oslo_concurrency.lockutils [req-13e02398-4950-4380-bf4d-e5d2e12998f1 req-065902a9-26ce-4dd3-9283-8891c6457c65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.940 182627 DEBUG oslo_concurrency.lockutils [req-13e02398-4950-4380-bf4d-e5d2e12998f1 req-065902a9-26ce-4dd3-9283-8891c6457c65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.940 182627 DEBUG oslo_concurrency.lockutils [req-13e02398-4950-4380-bf4d-e5d2e12998f1 req-065902a9-26ce-4dd3-9283-8891c6457c65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.940 182627 DEBUG nova.compute.manager [req-13e02398-4950-4380-bf4d-e5d2e12998f1 req-065902a9-26ce-4dd3-9283-8891c6457c65 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Processing event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.941 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.944 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122150.9438608, b1398aa0-c60b-4f85-8470-bf1860e92421 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.944 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.945 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.948 182627 INFO nova.virt.libvirt.driver [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Instance spawned successfully.#033[00m
Jan 22 17:49:10 np0005592767 nova_compute[182623]: 2026-01-22 22:49:10.948 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.112 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.112 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.113 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.113 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.114 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.114 182627 DEBUG nova.virt.libvirt.driver [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.120 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.123 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.157 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.187 182627 INFO nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Took 8.63 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.187 182627 DEBUG nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.221 182627 DEBUG nova.network.neutron [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updated VIF entry in instance network info cache for port cd7bb548-040e-4490-b6ad-99fce3c24eed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.222 182627 DEBUG nova.network.neutron [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updating instance_info_cache with network_info: [{"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.256 182627 DEBUG oslo_concurrency.lockutils [req-201b1e7b-56cd-4774-95b1-625303903d0f req-fa3e5fe1-d342-48ed-b641-ebf856a2af07 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.273 182627 INFO nova.compute.manager [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Took 9.18 seconds to build instance.#033[00m
Jan 22 17:49:11 np0005592767 nova_compute[182623]: 2026-01-22 22:49:11.291 182627 DEBUG oslo_concurrency.lockutils [None req-15e37604-bc4c-4e12-b586-0f8e64452758 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:12.119 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:12.120 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:12.120 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:12 np0005592767 podman[237275]: 2026-01-22 22:49:12.175673737 +0000 UTC m=+0.085990013 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:49:12 np0005592767 nova_compute[182623]: 2026-01-22 22:49:12.788 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.076 182627 DEBUG nova.compute.manager [req-e3222899-f10c-4df5-aee4-cdec2bc11139 req-ca842c32-2303-4ce8-a35a-f25ea1fdd4d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.076 182627 DEBUG oslo_concurrency.lockutils [req-e3222899-f10c-4df5-aee4-cdec2bc11139 req-ca842c32-2303-4ce8-a35a-f25ea1fdd4d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.077 182627 DEBUG oslo_concurrency.lockutils [req-e3222899-f10c-4df5-aee4-cdec2bc11139 req-ca842c32-2303-4ce8-a35a-f25ea1fdd4d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.077 182627 DEBUG oslo_concurrency.lockutils [req-e3222899-f10c-4df5-aee4-cdec2bc11139 req-ca842c32-2303-4ce8-a35a-f25ea1fdd4d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.077 182627 DEBUG nova.compute.manager [req-e3222899-f10c-4df5-aee4-cdec2bc11139 req-ca842c32-2303-4ce8-a35a-f25ea1fdd4d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] No waiting events found dispatching network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.077 182627 WARNING nova.compute.manager [req-e3222899-f10c-4df5-aee4-cdec2bc11139 req-ca842c32-2303-4ce8-a35a-f25ea1fdd4d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received unexpected event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed for instance with vm_state active and task_state None.#033[00m
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.293 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:13Z|00730|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:13Z|00731|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:13 np0005592767 nova_compute[182623]: 2026-01-22 22:49:13.897 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:14 np0005592767 nova_compute[182623]: 2026-01-22 22:49:14.513 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122139.5118766, c07ca635-3e21-4a87-919b-1eeca64c5282 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:49:14 np0005592767 nova_compute[182623]: 2026-01-22 22:49:14.514 182627 INFO nova.compute.manager [-] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:49:14 np0005592767 nova_compute[182623]: 2026-01-22 22:49:14.543 182627 DEBUG nova.compute.manager [None req-24a97d2f-2dc2-4dee-a0ea-1795cea594fb - - - - - -] [instance: c07ca635-3e21-4a87-919b-1eeca64c5282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:49:15 np0005592767 NetworkManager[54973]: <info>  [1769122155.1800] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:15 np0005592767 NetworkManager[54973]: <info>  [1769122155.1820] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.309 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:15 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:15Z|00732|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.329 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.468 182627 DEBUG nova.compute.manager [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-changed-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.470 182627 DEBUG nova.compute.manager [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Refreshing instance network info cache due to event network-changed-cd7bb548-040e-4490-b6ad-99fce3c24eed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.470 182627 DEBUG oslo_concurrency.lockutils [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.471 182627 DEBUG oslo_concurrency.lockutils [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:49:15 np0005592767 nova_compute[182623]: 2026-01-22 22:49:15.471 182627 DEBUG nova.network.neutron [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Refreshing network info cache for port cd7bb548-040e-4490-b6ad-99fce3c24eed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:49:17 np0005592767 podman[237300]: 2026-01-22 22:49:17.174270918 +0000 UTC m=+0.079595903 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:49:17 np0005592767 podman[237299]: 2026-01-22 22:49:17.19625346 +0000 UTC m=+0.112640858 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:49:17 np0005592767 nova_compute[182623]: 2026-01-22 22:49:17.386 182627 DEBUG nova.network.neutron [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updated VIF entry in instance network info cache for port cd7bb548-040e-4490-b6ad-99fce3c24eed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:49:17 np0005592767 nova_compute[182623]: 2026-01-22 22:49:17.387 182627 DEBUG nova.network.neutron [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updating instance_info_cache with network_info: [{"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:17 np0005592767 nova_compute[182623]: 2026-01-22 22:49:17.410 182627 DEBUG oslo_concurrency.lockutils [req-8803ec71-364d-446d-a4a9-1c2409e14cf0 req-e819f178-099c-4707-8dd7-7336eb10ca5c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:49:17 np0005592767 nova_compute[182623]: 2026-01-22 22:49:17.792 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:18Z|00733|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:18 np0005592767 nova_compute[182623]: 2026-01-22 22:49:18.265 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:18 np0005592767 nova_compute[182623]: 2026-01-22 22:49:18.294 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:22 np0005592767 nova_compute[182623]: 2026-01-22 22:49:22.791 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:22Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:8d:28 10.100.0.5
Jan 22 17:49:22 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:22Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:8d:28 10.100.0.5
Jan 22 17:49:23 np0005592767 nova_compute[182623]: 2026-01-22 22:49:23.296 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:25 np0005592767 podman[237367]: 2026-01-22 22:49:25.182417438 +0000 UTC m=+0.090484090 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:49:25 np0005592767 podman[237366]: 2026-01-22 22:49:25.208601699 +0000 UTC m=+0.115115877 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:49:27 np0005592767 nova_compute[182623]: 2026-01-22 22:49:27.567 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:27 np0005592767 nova_compute[182623]: 2026-01-22 22:49:27.793 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:28 np0005592767 nova_compute[182623]: 2026-01-22 22:49:28.334 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:30 np0005592767 podman[237410]: 2026-01-22 22:49:30.129239173 +0000 UTC m=+0.055151771 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:49:30 np0005592767 nova_compute[182623]: 2026-01-22 22:49:30.294 182627 INFO nova.compute.manager [None req-f80b2adf-c7fb-4ce1-844e-2d57e73c249d b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Get console output#033[00m
Jan 22 17:49:30 np0005592767 nova_compute[182623]: 2026-01-22 22:49:30.299 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:49:30 np0005592767 nova_compute[182623]: 2026-01-22 22:49:30.817 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:30 np0005592767 nova_compute[182623]: 2026-01-22 22:49:30.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:30 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:30Z|00734|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:30 np0005592767 nova_compute[182623]: 2026-01-22 22:49:30.917 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:31 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:31Z|00735|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:31 np0005592767 nova_compute[182623]: 2026-01-22 22:49:31.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:31 np0005592767 nova_compute[182623]: 2026-01-22 22:49:31.913 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:31 np0005592767 nova_compute[182623]: 2026-01-22 22:49:31.913 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:32 np0005592767 nova_compute[182623]: 2026-01-22 22:49:32.151 182627 INFO nova.compute.manager [None req-fa3d55c6-39bd-4822-87d0-ab8b00d752a9 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Get console output#033[00m
Jan 22 17:49:32 np0005592767 nova_compute[182623]: 2026-01-22 22:49:32.155 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:49:32 np0005592767 nova_compute[182623]: 2026-01-22 22:49:32.794 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:32 np0005592767 nova_compute[182623]: 2026-01-22 22:49:32.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:33 np0005592767 nova_compute[182623]: 2026-01-22 22:49:33.336 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:33 np0005592767 nova_compute[182623]: 2026-01-22 22:49:33.883 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:33 np0005592767 NetworkManager[54973]: <info>  [1769122173.8849] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Jan 22 17:49:33 np0005592767 NetworkManager[54973]: <info>  [1769122173.8863] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 22 17:49:33 np0005592767 nova_compute[182623]: 2026-01-22 22:49:33.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:33 np0005592767 nova_compute[182623]: 2026-01-22 22:49:33.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:49:33 np0005592767 nova_compute[182623]: 2026-01-22 22:49:33.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:34Z|00736|binding|INFO|Releasing lport 565ba943-8c16-4ef6-8a84-e44dd49c47d6 from this chassis (sb_readonly=0)
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.082 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.227 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.228 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.228 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.229 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b1398aa0-c60b-4f85-8470-bf1860e92421 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.469 182627 INFO nova.compute.manager [None req-99e3bbe8-78b8-46c0-8016-b2ffe26a29d6 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Get console output#033[00m
Jan 22 17:49:34 np0005592767 nova_compute[182623]: 2026-01-22 22:49:34.474 211280 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.523 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.523 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.524 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.524 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.525 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.537 182627 INFO nova.compute.manager [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Terminating instance#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.546 182627 DEBUG nova.compute.manager [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:49:35 np0005592767 kernel: tapcd7bb548-04 (unregistering): left promiscuous mode
Jan 22 17:49:35 np0005592767 NetworkManager[54973]: <info>  [1769122175.5673] device (tapcd7bb548-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.607 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:35Z|00737|binding|INFO|Releasing lport cd7bb548-040e-4490-b6ad-99fce3c24eed from this chassis (sb_readonly=0)
Jan 22 17:49:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:35Z|00738|binding|INFO|Setting lport cd7bb548-040e-4490-b6ad-99fce3c24eed down in Southbound
Jan 22 17:49:35 np0005592767 ovn_controller[94769]: 2026-01-22T22:49:35Z|00739|binding|INFO|Removing iface tapcd7bb548-04 ovn-installed in OVS
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.610 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.616 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:8d:28 10.100.0.5'], port_security=['fa:16:3e:d0:8d:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b1398aa0-c60b-4f85-8470-bf1860e92421', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ffd58948cb444c25ae034a02c0344de7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ac9f56c-756c-4bae-b022-893ccfd8049d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec7fe50b-af21-462f-9bec-fc3a2a2489b8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=cd7bb548-040e-4490-b6ad-99fce3c24eed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.618 104135 INFO neutron.agent.ovn.metadata.agent [-] Port cd7bb548-040e-4490-b6ad-99fce3c24eed in datapath 155ecbc3-b59d-47dc-a85e-f58d50789f60 unbound from our chassis#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.620 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 155ecbc3-b59d-47dc-a85e-f58d50789f60, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.622 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[12d55260-7281-4f64-92fa-904739e0dbea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.623 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60 namespace which is not needed anymore#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.632 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 22 17:49:35 np0005592767 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000a5.scope: Consumed 12.672s CPU time.
Jan 22 17:49:35 np0005592767 systemd-machined[153912]: Machine qemu-88-instance-000000a5 terminated.
Jan 22 17:49:35 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [NOTICE]   (237264) : haproxy version is 2.8.14-c23fe91
Jan 22 17:49:35 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [NOTICE]   (237264) : path to executable is /usr/sbin/haproxy
Jan 22 17:49:35 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [WARNING]  (237264) : Exiting Master process...
Jan 22 17:49:35 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [WARNING]  (237264) : Exiting Master process...
Jan 22 17:49:35 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [ALERT]    (237264) : Current worker (237266) exited with code 143 (Terminated)
Jan 22 17:49:35 np0005592767 neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60[237260]: [WARNING]  (237264) : All workers exited. Exiting... (0)
Jan 22 17:49:35 np0005592767 systemd[1]: libpod-f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36.scope: Deactivated successfully.
Jan 22 17:49:35 np0005592767 podman[237461]: 2026-01-22 22:49:35.785487666 +0000 UTC m=+0.050544341 container died f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 17:49:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36-userdata-shm.mount: Deactivated successfully.
Jan 22 17:49:35 np0005592767 systemd[1]: var-lib-containers-storage-overlay-8eca81372ca5b3997c6963068193b256ca830ff126c45373efbea518cefc0aaa-merged.mount: Deactivated successfully.
Jan 22 17:49:35 np0005592767 podman[237461]: 2026-01-22 22:49:35.823209713 +0000 UTC m=+0.088266378 container cleanup f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.830 182627 INFO nova.virt.libvirt.driver [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Instance destroyed successfully.#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.830 182627 DEBUG nova.objects.instance [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lazy-loading 'resources' on Instance uuid b1398aa0-c60b-4f85-8470-bf1860e92421 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:49:35 np0005592767 systemd[1]: libpod-conmon-f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36.scope: Deactivated successfully.
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.857 182627 DEBUG nova.virt.libvirt.vif [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:49:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1001577903',display_name='tempest-TestNetworkBasicOps-server-1001577903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1001577903',id=165,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOEarH5Oom9ikHlXd4qR1fqXmt95dmoizhFBtQZ+VJrYjfo4gWE5XcKfo2GdT5UuzqpCae4jcETw/nZpgv7EdSOjq2koqsL81q7akAsbAiRbKuQt0+cIBgw492k8qLNufQ==',key_name='tempest-TestNetworkBasicOps-1119247128',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:49:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ffd58948cb444c25ae034a02c0344de7',ramdisk_id='',reservation_id='r-o2kxyzgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-645382902',owner_user_name='tempest-TestNetworkBasicOps-645382902-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:49:11Z,user_data=None,user_id='b6d72b45b07b4237a9bb58e93cc801f2',uuid=b1398aa0-c60b-4f85-8470-bf1860e92421,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.857 182627 DEBUG nova.network.os_vif_util [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converting VIF {"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.858 182627 DEBUG nova.network.os_vif_util [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.859 182627 DEBUG os_vif [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.861 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.861 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd7bb548-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.864 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.865 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.867 182627 INFO os_vif [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:8d:28,bridge_name='br-int',has_traffic_filtering=True,id=cd7bb548-040e-4490-b6ad-99fce3c24eed,network=Network(155ecbc3-b59d-47dc-a85e-f58d50789f60),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd7bb548-04')#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.868 182627 INFO nova.virt.libvirt.driver [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Deleting instance files /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421_del#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.869 182627 INFO nova.virt.libvirt.driver [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Deletion of /var/lib/nova/instances/b1398aa0-c60b-4f85-8470-bf1860e92421_del complete#033[00m
Jan 22 17:49:35 np0005592767 podman[237507]: 2026-01-22 22:49:35.893644595 +0000 UTC m=+0.046796075 container remove f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.899 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3c37de-6891-44a2-b66c-a15fe2ab294b]: (4, ('Thu Jan 22 10:49:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60 (f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36)\nf5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36\nThu Jan 22 10:49:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60 (f5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36)\nf5cc3ea9c0d5b881fa1dc064fcf141f2d358cdbf97d10ac0ceaddcbd72f24e36\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.901 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[aecf3834-1a4d-4f57-9939-0001d0bbb1e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.902 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap155ecbc3-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:49:35 np0005592767 kernel: tap155ecbc3-b0: left promiscuous mode
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.904 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.908 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d63dd7c2-0a14-49ec-bd27-e76e8452779c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.919 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.930 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d53361-e4af-4401-b33c-38e46488d218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.931 182627 INFO nova.compute.manager [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.931 182627 DEBUG oslo.service.loopingcall [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.932 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[243e21ca-ce86-4b77-8d12-68706dbc9a0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.932 182627 DEBUG nova.compute.manager [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:49:35 np0005592767 nova_compute[182623]: 2026-01-22 22:49:35.932 182627 DEBUG nova.network.neutron [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.947 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[917e48b0-dfa6-42e7-b5df-832021dafa33]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574621, 'reachable_time': 44185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237523, 'error': None, 'target': 'ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.949 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-155ecbc3-b59d-47dc-a85e-f58d50789f60 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:49:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:49:35.950 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[e70abae4-6044-40f8-a3df-df3f9231a191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:49:35 np0005592767 systemd[1]: run-netns-ovnmeta\x2d155ecbc3\x2db59d\x2d47dc\x2da85e\x2df58d50789f60.mount: Deactivated successfully.
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.178 182627 DEBUG nova.network.neutron [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.197 182627 INFO nova.compute.manager [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Took 1.27 seconds to deallocate network for instance.#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.267 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.268 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.274 182627 DEBUG nova.compute.manager [req-93433796-d359-45f0-88a2-9a8938d89443 req-96d1345a-485b-431c-852c-ba72c8b8e4e1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-vif-deleted-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.401 182627 DEBUG nova.compute.provider_tree [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.420 182627 DEBUG nova.scheduler.client.report [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.451 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.491 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updating instance_info_cache with network_info: [{"id": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "address": "fa:16:3e:d0:8d:28", "network": {"id": "155ecbc3-b59d-47dc-a85e-f58d50789f60", "bridge": "br-int", "label": "tempest-network-smoke--1826500480", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ffd58948cb444c25ae034a02c0344de7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd7bb548-04", "ovs_interfaceid": "cd7bb548-040e-4490-b6ad-99fce3c24eed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.522 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.523 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.524 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.549 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.550 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.550 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.550 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.552 182627 INFO nova.scheduler.client.report [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Deleted allocations for instance b1398aa0-c60b-4f85-8470-bf1860e92421#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.646 182627 DEBUG oslo_concurrency.lockutils [None req-368a6141-c959-4703-b98d-3b2aa6de0622 b6d72b45b07b4237a9bb58e93cc801f2 ffd58948cb444c25ae034a02c0344de7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.706 182627 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-changed-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.706 182627 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Refreshing instance network info cache due to event network-changed-cd7bb548-040e-4490-b6ad-99fce3c24eed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.706 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.707 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.707 182627 DEBUG nova.network.neutron [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Refreshing network info cache for port cd7bb548-040e-4490-b6ad-99fce3c24eed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.738 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.739 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5692MB free_disk=73.05248641967773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.739 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.739 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.796 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.828 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.829 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.849 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.862 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.890 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:49:37 np0005592767 nova_compute[182623]: 2026-01-22 22:49:37.890 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.228 182627 INFO nova.network.neutron [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Port cd7bb548-040e-4490-b6ad-99fce3c24eed from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.228 182627 DEBUG nova.network.neutron [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.229 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-b1398aa0-c60b-4f85-8470-bf1860e92421" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.229 182627 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-vif-unplugged-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.229 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.230 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.230 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.230 182627 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] No waiting events found dispatching network-vif-unplugged-cd7bb548-040e-4490-b6ad-99fce3c24eed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.230 182627 WARNING nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received unexpected event network-vif-unplugged-cd7bb548-040e-4490-b6ad-99fce3c24eed for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.231 182627 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.231 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.231 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.232 182627 DEBUG oslo_concurrency.lockutils [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "b1398aa0-c60b-4f85-8470-bf1860e92421-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.232 182627 DEBUG nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] No waiting events found dispatching network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.232 182627 WARNING nova.compute.manager [req-590c3706-df10-4a56-a44b-01df369af435 req-15286354-2362-417c-bc8e-fb55d919bb84 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Received unexpected event network-vif-plugged-cd7bb548-040e-4490-b6ad-99fce3c24eed for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.377 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:38 np0005592767 nova_compute[182623]: 2026-01-22 22:49:38.540 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:39 np0005592767 nova_compute[182623]: 2026-01-22 22:49:39.264 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:40 np0005592767 nova_compute[182623]: 2026-01-22 22:49:40.863 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:40 np0005592767 nova_compute[182623]: 2026-01-22 22:49:40.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:40 np0005592767 nova_compute[182623]: 2026-01-22 22:49:40.895 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:40 np0005592767 nova_compute[182623]: 2026-01-22 22:49:40.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:49:42 np0005592767 nova_compute[182623]: 2026-01-22 22:49:42.798 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:43 np0005592767 podman[237526]: 2026-01-22 22:49:43.144151831 +0000 UTC m=+0.062134009 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:49:45 np0005592767 nova_compute[182623]: 2026-01-22 22:49:45.865 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:45 np0005592767 nova_compute[182623]: 2026-01-22 22:49:45.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:46 np0005592767 nova_compute[182623]: 2026-01-22 22:49:46.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:46 np0005592767 nova_compute[182623]: 2026-01-22 22:49:46.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:49:46 np0005592767 nova_compute[182623]: 2026-01-22 22:49:46.909 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:49:47 np0005592767 nova_compute[182623]: 2026-01-22 22:49:47.799 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:47 np0005592767 nova_compute[182623]: 2026-01-22 22:49:47.904 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:48 np0005592767 podman[237549]: 2026-01-22 22:49:48.146358313 +0000 UTC m=+0.098608680 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350)
Jan 22 17:49:48 np0005592767 podman[237548]: 2026-01-22 22:49:48.202445279 +0000 UTC m=+0.153179473 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:49:49 np0005592767 nova_compute[182623]: 2026-01-22 22:49:49.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:49:49 np0005592767 nova_compute[182623]: 2026-01-22 22:49:49.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:49:50 np0005592767 nova_compute[182623]: 2026-01-22 22:49:50.828 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122175.8274968, b1398aa0-c60b-4f85-8470-bf1860e92421 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:49:50 np0005592767 nova_compute[182623]: 2026-01-22 22:49:50.829 182627 INFO nova.compute.manager [-] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:49:50 np0005592767 nova_compute[182623]: 2026-01-22 22:49:50.847 182627 DEBUG nova.compute.manager [None req-3d944e36-7f5f-4108-ad84-3ca51c258d7f - - - - - -] [instance: b1398aa0-c60b-4f85-8470-bf1860e92421] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:49:50 np0005592767 nova_compute[182623]: 2026-01-22 22:49:50.868 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:52 np0005592767 nova_compute[182623]: 2026-01-22 22:49:52.802 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:55 np0005592767 nova_compute[182623]: 2026-01-22 22:49:55.905 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:49:56 np0005592767 podman[237592]: 2026-01-22 22:49:56.153430001 +0000 UTC m=+0.067861301 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 17:49:56 np0005592767 podman[237593]: 2026-01-22 22:49:56.185053645 +0000 UTC m=+0.096677095 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:49:57 np0005592767 nova_compute[182623]: 2026-01-22 22:49:57.804 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:00 np0005592767 nova_compute[182623]: 2026-01-22 22:50:00.940 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:01 np0005592767 podman[237634]: 2026-01-22 22:50:01.132494848 +0000 UTC m=+0.053472073 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:50:02 np0005592767 nova_compute[182623]: 2026-01-22 22:50:02.807 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:05 np0005592767 nova_compute[182623]: 2026-01-22 22:50:05.966 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:50:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.446 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.447 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.465 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.562 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.562 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.571 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.572 182627 INFO nova.compute.claims [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.796 182627 DEBUG nova.compute.provider_tree [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.809 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.813 182627 DEBUG nova.scheduler.client.report [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.835 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.836 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.896 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.897 182627 DEBUG nova.network.neutron [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.920 182627 INFO nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:50:07 np0005592767 nova_compute[182623]: 2026-01-22 22:50:07.949 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.066 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.067 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.067 182627 INFO nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Creating image(s)#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.068 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.068 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.069 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.084 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.103 182627 DEBUG nova.policy [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.145 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.146 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.146 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.158 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.211 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.212 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.244 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.246 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.247 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.306 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.307 182627 DEBUG nova.virt.disk.api [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.308 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.403 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.405 182627 DEBUG nova.virt.disk.api [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.406 182627 DEBUG nova.objects.instance [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid 59a7ea0d-16ce-4a74-9c5b-ee970251c696 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.419 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.420 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Ensure instance console log exists: /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.421 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.421 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.422 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:08.792 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.793 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:08 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:08.794 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:50:08 np0005592767 nova_compute[182623]: 2026-01-22 22:50:08.901 182627 DEBUG nova.network.neutron [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Successfully created port: 6328f179-0f3e-4d6d-9928-1a555225eb43 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:50:10 np0005592767 nova_compute[182623]: 2026-01-22 22:50:10.426 182627 DEBUG nova.network.neutron [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Successfully updated port: 6328f179-0f3e-4d6d-9928-1a555225eb43 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:50:10 np0005592767 nova_compute[182623]: 2026-01-22 22:50:10.443 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:50:10 np0005592767 nova_compute[182623]: 2026-01-22 22:50:10.443 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:50:10 np0005592767 nova_compute[182623]: 2026-01-22 22:50:10.444 182627 DEBUG nova.network.neutron [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:50:10 np0005592767 nova_compute[182623]: 2026-01-22 22:50:10.967 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:11 np0005592767 nova_compute[182623]: 2026-01-22 22:50:11.002 182627 DEBUG nova.network.neutron [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:50:11 np0005592767 nova_compute[182623]: 2026-01-22 22:50:11.161 182627 DEBUG nova.compute.manager [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-changed-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:50:11 np0005592767 nova_compute[182623]: 2026-01-22 22:50:11.162 182627 DEBUG nova.compute.manager [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Refreshing instance network info cache due to event network-changed-6328f179-0f3e-4d6d-9928-1a555225eb43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:50:11 np0005592767 nova_compute[182623]: 2026-01-22 22:50:11.162 182627 DEBUG oslo_concurrency.lockutils [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.034 182627 DEBUG nova.network.neutron [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updating instance_info_cache with network_info: [{"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.057 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.058 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Instance network_info: |[{"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.059 182627 DEBUG oslo_concurrency.lockutils [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.059 182627 DEBUG nova.network.neutron [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Refreshing network info cache for port 6328f179-0f3e-4d6d-9928-1a555225eb43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.065 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Start _get_guest_xml network_info=[{"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.073 182627 WARNING nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.082 182627 DEBUG nova.virt.libvirt.host [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.083 182627 DEBUG nova.virt.libvirt.host [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.094 182627 DEBUG nova.virt.libvirt.host [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.096 182627 DEBUG nova.virt.libvirt.host [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.098 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.098 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.099 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.100 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.100 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.101 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.101 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.102 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.102 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.103 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.103 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.104 182627 DEBUG nova.virt.hardware [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.112 182627 DEBUG nova.virt.libvirt.vif [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=167,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh4l03Ed713MtSwBd/OpoUvgw2ZZfNGVsi0SV7wM77cPFt45nYMbcyVnDiB21R43mQRNOkRiPSap7J4JYwIHs2li3zkTeuIl+BTd0GCWa2RVxYz3GyzqOITgWwQhVQxNQ==',key_name='tempest-TestSecurityGroupsBasicOps-840099720',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-dv9a1l5o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:50:07Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=59a7ea0d-16ce-4a74-9c5b-ee970251c696,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.113 182627 DEBUG nova.network.os_vif_util [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.114 182627 DEBUG nova.network.os_vif_util [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.116 182627 DEBUG nova.objects.instance [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid 59a7ea0d-16ce-4a74-9c5b-ee970251c696 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:50:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:12.120 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:12.121 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:12.121 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.142 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <uuid>59a7ea0d-16ce-4a74-9c5b-ee970251c696</uuid>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <name>instance-000000a7</name>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862</nova:name>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:50:12</nova:creationTime>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        <nova:port uuid="6328f179-0f3e-4d6d-9928-1a555225eb43">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <entry name="serial">59a7ea0d-16ce-4a74-9c5b-ee970251c696</entry>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <entry name="uuid">59a7ea0d-16ce-4a74-9c5b-ee970251c696</entry>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.config"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:10:8a:c6"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <target dev="tap6328f179-0f"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/console.log" append="off"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:50:12 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:50:12 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:50:12 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:50:12 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.143 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Preparing to wait for external event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.144 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.144 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.145 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.146 182627 DEBUG nova.virt.libvirt.vif [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=167,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh4l03Ed713MtSwBd/OpoUvgw2ZZfNGVsi0SV7wM77cPFt45nYMbcyVnDiB21R43mQRNOkRiPSap7J4JYwIHs2li3zkTeuIl+BTd0GCWa2RVxYz3GyzqOITgWwQhVQxNQ==',key_name='tempest-TestSecurityGroupsBasicOps-840099720',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-dv9a1l5o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:50:07Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=59a7ea0d-16ce-4a74-9c5b-ee970251c696,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.146 182627 DEBUG nova.network.os_vif_util [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.148 182627 DEBUG nova.network.os_vif_util [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.149 182627 DEBUG os_vif [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.151 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.152 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.156 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6328f179-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.157 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6328f179-0f, col_values=(('external_ids', {'iface-id': '6328f179-0f3e-4d6d-9928-1a555225eb43', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:8a:c6', 'vm-uuid': '59a7ea0d-16ce-4a74-9c5b-ee970251c696'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.158 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:12 np0005592767 NetworkManager[54973]: <info>  [1769122212.1594] manager: (tap6328f179-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.161 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.165 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.166 182627 INFO os_vif [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f')#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.224 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.224 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.225 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:10:8a:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.225 182627 INFO nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Using config drive#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.810 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.953 182627 INFO nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Creating config drive at /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.config#033[00m
Jan 22 17:50:12 np0005592767 nova_compute[182623]: 2026-01-22 22:50:12.958 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj_ef2uvd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.081 182627 DEBUG oslo_concurrency.processutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj_ef2uvd" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:13 np0005592767 kernel: tap6328f179-0f: entered promiscuous mode
Jan 22 17:50:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:13Z|00740|binding|INFO|Claiming lport 6328f179-0f3e-4d6d-9928-1a555225eb43 for this chassis.
Jan 22 17:50:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:13Z|00741|binding|INFO|6328f179-0f3e-4d6d-9928-1a555225eb43: Claiming fa:16:3e:10:8a:c6 10.100.0.6
Jan 22 17:50:13 np0005592767 NetworkManager[54973]: <info>  [1769122213.1459] manager: (tap6328f179-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.144 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.158 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:8a:c6 10.100.0.6'], port_security=['fa:16:3e:10:8a:c6 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '59a7ea0d-16ce-4a74-9c5b-ee970251c696', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8fc155a3-6ad0-4ec0-9fb6-b95cfed7cf3d d742843f-b8f4-4edd-aab0-ea2a112374a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bfd8330-9aaa-4207-a5ef-fd1434836aa3, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=6328f179-0f3e-4d6d-9928-1a555225eb43) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.160 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 6328f179-0f3e-4d6d-9928-1a555225eb43 in datapath c751c9ce-eaff-44e8-9be1-aa0c9e70af9b bound to our chassis#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.161 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c751c9ce-eaff-44e8-9be1-aa0c9e70af9b#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.172 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9882f214-2012-4bca-82fb-528a7f911bce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.173 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc751c9ce-e1 in ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.175 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc751c9ce-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.175 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5770e923-a6c3-46a2-acd7-572746b27f52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.175 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6a90fb9c-9542-42f5-9863-0a21d0a94407]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.186 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[bd81741d-c879-494c-9d3c-523b7f79a5c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.219 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4852bb29-dc76-4ce1-9b98-6813d1aae572]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.237 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 systemd-udevd[237703]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:50:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:13Z|00742|binding|INFO|Setting lport 6328f179-0f3e-4d6d-9928-1a555225eb43 ovn-installed in OVS
Jan 22 17:50:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:13Z|00743|binding|INFO|Setting lport 6328f179-0f3e-4d6d-9928-1a555225eb43 up in Southbound
Jan 22 17:50:13 np0005592767 systemd-machined[153912]: New machine qemu-89-instance-000000a7.
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.245 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 NetworkManager[54973]: <info>  [1769122213.2582] device (tap6328f179-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:50:13 np0005592767 NetworkManager[54973]: <info>  [1769122213.2594] device (tap6328f179-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:50:13 np0005592767 systemd[1]: Started Virtual Machine qemu-89-instance-000000a7.
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.263 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e3dd8870-f479-4a66-8d91-e78843848730]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.269 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b27d5f4e-aeab-4577-9bea-9610af360c84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 NetworkManager[54973]: <info>  [1769122213.2765] manager: (tapc751c9ce-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.301 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[5478bbab-40e5-4b7f-9147-315640457415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.305 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1b617e69-90b5-4be1-b96f-58da847e9649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 podman[237690]: 2026-01-22 22:50:13.316283289 +0000 UTC m=+0.090452319 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 17:50:13 np0005592767 NetworkManager[54973]: <info>  [1769122213.3293] device (tapc751c9ce-e0): carrier: link connected
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.335 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3bb062-74d6-4de9-9c77-467822822e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.356 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[db441b50-be5f-44a6-b252-6a16cd68d964]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc751c9ce-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:52:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580990, 'reachable_time': 15751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237744, 'error': None, 'target': 'ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.374 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e23cc8e2-8241-46a3-a673-aaf377ee40d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:525b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580990, 'tstamp': 580990}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237746, 'error': None, 'target': 'ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.393 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1571d06b-45b7-4f8d-97a8-bfc7c54b5406]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc751c9ce-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:52:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580990, 'reachable_time': 15751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237747, 'error': None, 'target': 'ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.421 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9adaca53-7efb-4dc4-90be-a06fb87c8562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.614 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4b089741-7696-453e-bb01-3432efc1f3a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.616 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc751c9ce-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.616 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.617 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc751c9ce-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.658 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 NetworkManager[54973]: <info>  [1769122213.6588] manager: (tapc751c9ce-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 22 17:50:13 np0005592767 kernel: tapc751c9ce-e0: entered promiscuous mode
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.662 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc751c9ce-e0, col_values=(('external_ids', {'iface-id': '1318d902-d993-4189-b722-3391f796c1ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.663 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:13Z|00744|binding|INFO|Releasing lport 1318d902-d993-4189-b722-3391f796c1ad from this chassis (sb_readonly=0)
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.664 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c751c9ce-eaff-44e8-9be1-aa0c9e70af9b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c751c9ce-eaff-44e8-9be1-aa0c9e70af9b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.665 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d2e5e6-5afb-4329-ba88-75308d3a4008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.666 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/c751c9ce-eaff-44e8-9be1-aa0c9e70af9b.pid.haproxy
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID c751c9ce-eaff-44e8-9be1-aa0c9e70af9b
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:50:13 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:13.667 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'env', 'PROCESS_TAG=haproxy-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c751c9ce-eaff-44e8-9be1-aa0c9e70af9b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.675 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.686 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122213.6853213, 59a7ea0d-16ce-4a74-9c5b-ee970251c696 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.686 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] VM Started (Lifecycle Event)#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.712 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.715 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122213.685527, 59a7ea0d-16ce-4a74-9c5b-ee970251c696 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.716 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.743 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.746 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:50:13 np0005592767 nova_compute[182623]: 2026-01-22 22:50:13.762 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:50:14 np0005592767 nova_compute[182623]: 2026-01-22 22:50:14.012 182627 DEBUG nova.network.neutron [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updated VIF entry in instance network info cache for port 6328f179-0f3e-4d6d-9928-1a555225eb43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:50:14 np0005592767 nova_compute[182623]: 2026-01-22 22:50:14.015 182627 DEBUG nova.network.neutron [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updating instance_info_cache with network_info: [{"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:50:14 np0005592767 nova_compute[182623]: 2026-01-22 22:50:14.035 182627 DEBUG oslo_concurrency.lockutils [req-2843f221-6661-4f4c-9bed-712aac033616 req-b22b9a2f-1866-4537-81ba-f0d9444ca3dd 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:50:14 np0005592767 podman[237786]: 2026-01-22 22:50:14.05623489 +0000 UTC m=+0.078281005 container create ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 17:50:14 np0005592767 systemd[1]: Started libpod-conmon-ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe.scope.
Jan 22 17:50:14 np0005592767 podman[237786]: 2026-01-22 22:50:14.018949795 +0000 UTC m=+0.040996000 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:50:14 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:50:14 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57265089bcc964df9660b73b8e9006d0c12be271d2eab87dc7c4ace01fc2af3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:50:14 np0005592767 podman[237786]: 2026-01-22 22:50:14.146580895 +0000 UTC m=+0.168627020 container init ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:50:14 np0005592767 podman[237786]: 2026-01-22 22:50:14.152572465 +0000 UTC m=+0.174618570 container start ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:50:14 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [NOTICE]   (237806) : New worker (237808) forked
Jan 22 17:50:14 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [NOTICE]   (237806) : Loading success.
Jan 22 17:50:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:50:14.797 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.047 182627 DEBUG nova.compute.manager [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.049 182627 DEBUG oslo_concurrency.lockutils [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.050 182627 DEBUG oslo_concurrency.lockutils [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.050 182627 DEBUG oslo_concurrency.lockutils [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.051 182627 DEBUG nova.compute.manager [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Processing event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.051 182627 DEBUG nova.compute.manager [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.052 182627 DEBUG oslo_concurrency.lockutils [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.052 182627 DEBUG oslo_concurrency.lockutils [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.052 182627 DEBUG oslo_concurrency.lockutils [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.053 182627 DEBUG nova.compute.manager [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] No waiting events found dispatching network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.053 182627 WARNING nova.compute.manager [req-3774a00d-473b-43a9-83d2-e0fd61aad176 req-22020ef4-a120-4a82-80d4-55b0dca1ca89 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received unexpected event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.055 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.060 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122215.0602934, 59a7ea0d-16ce-4a74-9c5b-ee970251c696 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.061 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.064 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.069 182627 INFO nova.virt.libvirt.driver [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Instance spawned successfully.#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.070 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.084 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.091 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.096 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.096 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.097 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.097 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.097 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.098 182627 DEBUG nova.virt.libvirt.driver [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.117 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.169 182627 INFO nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Took 7.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.169 182627 DEBUG nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.274 182627 INFO nova.compute.manager [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Took 7.75 seconds to build instance.#033[00m
Jan 22 17:50:15 np0005592767 nova_compute[182623]: 2026-01-22 22:50:15.294 182627 DEBUG oslo_concurrency.lockutils [None req-f634ea46-8645-4b3b-a7a3-33db06dedfa4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.158 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:17 np0005592767 NetworkManager[54973]: <info>  [1769122217.1602] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Jan 22 17:50:17 np0005592767 NetworkManager[54973]: <info>  [1769122217.1618] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.229 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:17Z|00745|binding|INFO|Releasing lport 1318d902-d993-4189-b722-3391f796c1ad from this chassis (sb_readonly=0)
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.241 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.361 182627 DEBUG nova.compute.manager [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-changed-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.361 182627 DEBUG nova.compute.manager [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Refreshing instance network info cache due to event network-changed-6328f179-0f3e-4d6d-9928-1a555225eb43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.361 182627 DEBUG oslo_concurrency.lockutils [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.361 182627 DEBUG oslo_concurrency.lockutils [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.362 182627 DEBUG nova.network.neutron [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Refreshing network info cache for port 6328f179-0f3e-4d6d-9928-1a555225eb43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:50:17 np0005592767 nova_compute[182623]: 2026-01-22 22:50:17.811 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:18 np0005592767 nova_compute[182623]: 2026-01-22 22:50:18.441 182627 DEBUG nova.network.neutron [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updated VIF entry in instance network info cache for port 6328f179-0f3e-4d6d-9928-1a555225eb43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:50:18 np0005592767 nova_compute[182623]: 2026-01-22 22:50:18.441 182627 DEBUG nova.network.neutron [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updating instance_info_cache with network_info: [{"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:50:18 np0005592767 nova_compute[182623]: 2026-01-22 22:50:18.459 182627 DEBUG oslo_concurrency.lockutils [req-cbccd0d9-4d52-47bf-b93e-c90c24003c3f req-b440aa6f-e3ca-4adf-9858-6c057cb3aebf 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:50:19 np0005592767 podman[237819]: 2026-01-22 22:50:19.142981634 +0000 UTC m=+0.058752963 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:50:19 np0005592767 podman[237818]: 2026-01-22 22:50:19.164907544 +0000 UTC m=+0.082090503 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:50:22 np0005592767 nova_compute[182623]: 2026-01-22 22:50:22.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:22 np0005592767 nova_compute[182623]: 2026-01-22 22:50:22.813 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:26Z|00746|binding|INFO|Releasing lport 1318d902-d993-4189-b722-3391f796c1ad from this chassis (sb_readonly=0)
Jan 22 17:50:26 np0005592767 nova_compute[182623]: 2026-01-22 22:50:26.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:26Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:8a:c6 10.100.0.6
Jan 22 17:50:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:50:26Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:8a:c6 10.100.0.6
Jan 22 17:50:27 np0005592767 nova_compute[182623]: 2026-01-22 22:50:27.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:27 np0005592767 podman[237884]: 2026-01-22 22:50:27.174171793 +0000 UTC m=+0.093100814 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:50:27 np0005592767 podman[237885]: 2026-01-22 22:50:27.185345539 +0000 UTC m=+0.098845247 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 17:50:27 np0005592767 nova_compute[182623]: 2026-01-22 22:50:27.818 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:32 np0005592767 nova_compute[182623]: 2026-01-22 22:50:32.170 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:32 np0005592767 podman[237924]: 2026-01-22 22:50:32.177262092 +0000 UTC m=+0.078162872 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:50:32 np0005592767 nova_compute[182623]: 2026-01-22 22:50:32.820 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:32 np0005592767 nova_compute[182623]: 2026-01-22 22:50:32.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:33 np0005592767 nova_compute[182623]: 2026-01-22 22:50:33.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:33 np0005592767 nova_compute[182623]: 2026-01-22 22:50:33.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:35 np0005592767 nova_compute[182623]: 2026-01-22 22:50:35.045 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:35 np0005592767 nova_compute[182623]: 2026-01-22 22:50:35.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:35 np0005592767 nova_compute[182623]: 2026-01-22 22:50:35.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:50:35 np0005592767 nova_compute[182623]: 2026-01-22 22:50:35.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:50:36 np0005592767 nova_compute[182623]: 2026-01-22 22:50:36.091 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:50:36 np0005592767 nova_compute[182623]: 2026-01-22 22:50:36.092 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:50:36 np0005592767 nova_compute[182623]: 2026-01-22 22:50:36.092 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:50:36 np0005592767 nova_compute[182623]: 2026-01-22 22:50:36.092 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 59a7ea0d-16ce-4a74-9c5b-ee970251c696 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:50:36 np0005592767 nova_compute[182623]: 2026-01-22 22:50:36.198 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:36 np0005592767 nova_compute[182623]: 2026-01-22 22:50:36.860 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:37 np0005592767 nova_compute[182623]: 2026-01-22 22:50:37.172 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:37 np0005592767 nova_compute[182623]: 2026-01-22 22:50:37.823 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.233 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updating instance_info_cache with network_info: [{"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.252 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.253 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.254 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.273 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Triggering sync for uuid 59a7ea0d-16ce-4a74-9c5b-ee970251c696 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.274 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.274 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.275 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.312 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.312 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.313 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.313 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.315 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.394 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.454 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.456 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.547 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.805 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.807 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5518MB free_disk=73.02383804321289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.808 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.808 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.903 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 59a7ea0d-16ce-4a74-9c5b-ee970251c696 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.904 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.905 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.955 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:50:38 np0005592767 nova_compute[182623]: 2026-01-22 22:50:38.969 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:50:39 np0005592767 nova_compute[182623]: 2026-01-22 22:50:39.001 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:50:39 np0005592767 nova_compute[182623]: 2026-01-22 22:50:39.001 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:50:39 np0005592767 nova_compute[182623]: 2026-01-22 22:50:39.645 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:40 np0005592767 nova_compute[182623]: 2026-01-22 22:50:40.009 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:41 np0005592767 nova_compute[182623]: 2026-01-22 22:50:41.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:41 np0005592767 nova_compute[182623]: 2026-01-22 22:50:41.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:50:42 np0005592767 nova_compute[182623]: 2026-01-22 22:50:42.176 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:42 np0005592767 nova_compute[182623]: 2026-01-22 22:50:42.870 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:42 np0005592767 nova_compute[182623]: 2026-01-22 22:50:42.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:43 np0005592767 nova_compute[182623]: 2026-01-22 22:50:43.789 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:44 np0005592767 podman[237955]: 2026-01-22 22:50:44.142022067 +0000 UTC m=+0.060902413 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:50:46 np0005592767 nova_compute[182623]: 2026-01-22 22:50:46.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:50:47 np0005592767 nova_compute[182623]: 2026-01-22 22:50:47.220 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:47 np0005592767 nova_compute[182623]: 2026-01-22 22:50:47.873 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:50 np0005592767 podman[237977]: 2026-01-22 22:50:50.188147818 +0000 UTC m=+0.086239050 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 17:50:50 np0005592767 podman[237976]: 2026-01-22 22:50:50.223690824 +0000 UTC m=+0.123246647 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:50:52 np0005592767 nova_compute[182623]: 2026-01-22 22:50:52.223 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:52 np0005592767 nova_compute[182623]: 2026-01-22 22:50:52.874 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:57 np0005592767 nova_compute[182623]: 2026-01-22 22:50:57.226 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:57 np0005592767 nova_compute[182623]: 2026-01-22 22:50:57.894 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:50:58 np0005592767 podman[238024]: 2026-01-22 22:50:58.128629983 +0000 UTC m=+0.048162174 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:50:58 np0005592767 podman[238025]: 2026-01-22 22:50:58.134921411 +0000 UTC m=+0.050294644 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:51:02 np0005592767 nova_compute[182623]: 2026-01-22 22:51:02.230 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:02 np0005592767 nova_compute[182623]: 2026-01-22 22:51:02.896 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:03 np0005592767 podman[238070]: 2026-01-22 22:51:03.129878736 +0000 UTC m=+0.055201572 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:51:07 np0005592767 nova_compute[182623]: 2026-01-22 22:51:07.232 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:07 np0005592767 nova_compute[182623]: 2026-01-22 22:51:07.897 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:12.121 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:12.122 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:12.123 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:12 np0005592767 nova_compute[182623]: 2026-01-22 22:51:12.235 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:12 np0005592767 nova_compute[182623]: 2026-01-22 22:51:12.898 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:15 np0005592767 podman[238094]: 2026-01-22 22:51:15.146975252 +0000 UTC m=+0.066064710 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 22 17:51:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:17.084 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:51:17 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:17.085 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:51:17 np0005592767 nova_compute[182623]: 2026-01-22 22:51:17.085 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:17 np0005592767 nova_compute[182623]: 2026-01-22 22:51:17.248 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:17 np0005592767 nova_compute[182623]: 2026-01-22 22:51:17.901 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:21 np0005592767 podman[238117]: 2026-01-22 22:51:21.169979649 +0000 UTC m=+0.081137836 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 22 17:51:21 np0005592767 podman[238116]: 2026-01-22 22:51:21.230410848 +0000 UTC m=+0.142049519 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true)
Jan 22 17:51:22 np0005592767 nova_compute[182623]: 2026-01-22 22:51:22.251 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:22 np0005592767 nova_compute[182623]: 2026-01-22 22:51:22.903 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:24.088 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:25Z|00747|binding|INFO|Releasing lport 1318d902-d993-4189-b722-3391f796c1ad from this chassis (sb_readonly=0)
Jan 22 17:51:26 np0005592767 nova_compute[182623]: 2026-01-22 22:51:26.093 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.254 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.434 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.435 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.435 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.435 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.436 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.454 182627 INFO nova.compute.manager [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Terminating instance#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.470 182627 DEBUG nova.compute.manager [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:51:27 np0005592767 kernel: tap6328f179-0f (unregistering): left promiscuous mode
Jan 22 17:51:27 np0005592767 NetworkManager[54973]: <info>  [1769122287.4996] device (tap6328f179-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.507 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:27Z|00748|binding|INFO|Releasing lport 6328f179-0f3e-4d6d-9928-1a555225eb43 from this chassis (sb_readonly=0)
Jan 22 17:51:27 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:27Z|00749|binding|INFO|Setting lport 6328f179-0f3e-4d6d-9928-1a555225eb43 down in Southbound
Jan 22 17:51:27 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:27Z|00750|binding|INFO|Removing iface tap6328f179-0f ovn-installed in OVS
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.522 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:8a:c6 10.100.0.6'], port_security=['fa:16:3e:10:8a:c6 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '59a7ea0d-16ce-4a74-9c5b-ee970251c696', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fc155a3-6ad0-4ec0-9fb6-b95cfed7cf3d d742843f-b8f4-4edd-aab0-ea2a112374a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4bfd8330-9aaa-4207-a5ef-fd1434836aa3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=6328f179-0f3e-4d6d-9928-1a555225eb43) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.526 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 6328f179-0f3e-4d6d-9928-1a555225eb43 in datapath c751c9ce-eaff-44e8-9be1-aa0c9e70af9b unbound from our chassis#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.528 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.530 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.532 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3481c0-998c-46e0-9c4f-f4b0bbdc7bf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.534 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b namespace which is not needed anymore#033[00m
Jan 22 17:51:27 np0005592767 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 22 17:51:27 np0005592767 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000a7.scope: Consumed 14.863s CPU time.
Jan 22 17:51:27 np0005592767 systemd-machined[153912]: Machine qemu-89-instance-000000a7 terminated.
Jan 22 17:51:27 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [NOTICE]   (237806) : haproxy version is 2.8.14-c23fe91
Jan 22 17:51:27 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [NOTICE]   (237806) : path to executable is /usr/sbin/haproxy
Jan 22 17:51:27 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [WARNING]  (237806) : Exiting Master process...
Jan 22 17:51:27 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [WARNING]  (237806) : Exiting Master process...
Jan 22 17:51:27 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [ALERT]    (237806) : Current worker (237808) exited with code 143 (Terminated)
Jan 22 17:51:27 np0005592767 neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b[237802]: [WARNING]  (237806) : All workers exited. Exiting... (0)
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.691 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 systemd[1]: libpod-ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe.scope: Deactivated successfully.
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.695 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 podman[238189]: 2026-01-22 22:51:27.698978299 +0000 UTC m=+0.051595151 container died ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:51:27 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe-userdata-shm.mount: Deactivated successfully.
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.732 182627 INFO nova.virt.libvirt.driver [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Instance destroyed successfully.#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.732 182627 DEBUG nova.objects.instance [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid 59a7ea0d-16ce-4a74-9c5b-ee970251c696 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:51:27 np0005592767 systemd[1]: var-lib-containers-storage-overlay-57265089bcc964df9660b73b8e9006d0c12be271d2eab87dc7c4ace01fc2af3d-merged.mount: Deactivated successfully.
Jan 22 17:51:27 np0005592767 podman[238189]: 2026-01-22 22:51:27.741839331 +0000 UTC m=+0.094456193 container cleanup ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.744 182627 DEBUG nova.virt.libvirt.vif [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2009878862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=167,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAh4l03Ed713MtSwBd/OpoUvgw2ZZfNGVsi0SV7wM77cPFt45nYMbcyVnDiB21R43mQRNOkRiPSap7J4JYwIHs2li3zkTeuIl+BTd0GCWa2RVxYz3GyzqOITgWwQhVQxNQ==',key_name='tempest-TestSecurityGroupsBasicOps-840099720',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:50:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-dv9a1l5o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:50:15Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=59a7ea0d-16ce-4a74-9c5b-ee970251c696,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.745 182627 DEBUG nova.network.os_vif_util [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "6328f179-0f3e-4d6d-9928-1a555225eb43", "address": "fa:16:3e:10:8a:c6", "network": {"id": "c751c9ce-eaff-44e8-9be1-aa0c9e70af9b", "bridge": "br-int", "label": "tempest-network-smoke--1035288216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6328f179-0f", "ovs_interfaceid": "6328f179-0f3e-4d6d-9928-1a555225eb43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.745 182627 DEBUG nova.network.os_vif_util [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.746 182627 DEBUG os_vif [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.747 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.748 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6328f179-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.749 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.751 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.753 182627 INFO os_vif [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:8a:c6,bridge_name='br-int',has_traffic_filtering=True,id=6328f179-0f3e-4d6d-9928-1a555225eb43,network=Network(c751c9ce-eaff-44e8-9be1-aa0c9e70af9b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6328f179-0f')#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.754 182627 INFO nova.virt.libvirt.driver [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Deleting instance files /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696_del#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.755 182627 INFO nova.virt.libvirt.driver [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Deletion of /var/lib/nova/instances/59a7ea0d-16ce-4a74-9c5b-ee970251c696_del complete#033[00m
Jan 22 17:51:27 np0005592767 systemd[1]: libpod-conmon-ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe.scope: Deactivated successfully.
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.833 182627 INFO nova.compute.manager [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.834 182627 DEBUG oslo.service.loopingcall [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.834 182627 DEBUG nova.compute.manager [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.834 182627 DEBUG nova.network.neutron [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:51:27 np0005592767 podman[238235]: 2026-01-22 22:51:27.837086875 +0000 UTC m=+0.060382899 container remove ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.842 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2c740ce8-dad5-4957-a181-c08d2d2ca856]: (4, ('Thu Jan 22 10:51:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b (ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe)\ned35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe\nThu Jan 22 10:51:27 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b (ed35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe)\ned35ea13684066b97e6e63f0dde0569eb6df4b333870ed6d61c9b6e53ebdaffe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.845 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9655d602-3963-4c4d-81ba-de13117b29a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.846 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc751c9ce-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 kernel: tapc751c9ce-e0: left promiscuous mode
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.859 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.862 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3444f32d-f0e3-4c73-90c6-487e1d4983c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.880 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eacea425-1d14-4931-91e5-04e3c8c72818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.882 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1d2894d9-f151-4384-8987-a65b6ec116a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.901 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[49e91533-45d9-4427-8d43-0ab69ab463fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580983, 'reachable_time': 35891, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238250, 'error': None, 'target': 'ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:27 np0005592767 nova_compute[182623]: 2026-01-22 22:51:27.904 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:27 np0005592767 systemd[1]: run-netns-ovnmeta\x2dc751c9ce\x2deaff\x2d44e8\x2d9be1\x2daa0c9e70af9b.mount: Deactivated successfully.
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.907 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c751c9ce-eaff-44e8-9be1-aa0c9e70af9b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:51:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:27.907 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[46f7e957-d1f1-4802-b2f8-81687e199b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.074 182627 DEBUG nova.compute.manager [req-0400650a-0963-493b-a8ba-3e0bac5bb1a0 req-a04e0590-9307-4351-98d5-bd33fabd9d94 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-vif-unplugged-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.075 182627 DEBUG oslo_concurrency.lockutils [req-0400650a-0963-493b-a8ba-3e0bac5bb1a0 req-a04e0590-9307-4351-98d5-bd33fabd9d94 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.075 182627 DEBUG oslo_concurrency.lockutils [req-0400650a-0963-493b-a8ba-3e0bac5bb1a0 req-a04e0590-9307-4351-98d5-bd33fabd9d94 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.076 182627 DEBUG oslo_concurrency.lockutils [req-0400650a-0963-493b-a8ba-3e0bac5bb1a0 req-a04e0590-9307-4351-98d5-bd33fabd9d94 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.076 182627 DEBUG nova.compute.manager [req-0400650a-0963-493b-a8ba-3e0bac5bb1a0 req-a04e0590-9307-4351-98d5-bd33fabd9d94 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] No waiting events found dispatching network-vif-unplugged-6328f179-0f3e-4d6d-9928-1a555225eb43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.076 182627 DEBUG nova.compute.manager [req-0400650a-0963-493b-a8ba-3e0bac5bb1a0 req-a04e0590-9307-4351-98d5-bd33fabd9d94 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-vif-unplugged-6328f179-0f3e-4d6d-9928-1a555225eb43 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.512 182627 DEBUG nova.compute.manager [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-changed-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.512 182627 DEBUG nova.compute.manager [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Refreshing instance network info cache due to event network-changed-6328f179-0f3e-4d6d-9928-1a555225eb43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.513 182627 DEBUG oslo_concurrency.lockutils [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.513 182627 DEBUG oslo_concurrency.lockutils [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.513 182627 DEBUG nova.network.neutron [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Refreshing network info cache for port 6328f179-0f3e-4d6d-9928-1a555225eb43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.691 182627 INFO nova.network.neutron [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Port 6328f179-0f3e-4d6d-9928-1a555225eb43 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.692 182627 DEBUG nova.network.neutron [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.708 182627 DEBUG oslo_concurrency.lockutils [req-7591dd70-6276-4464-82f3-585c275a7e3e req-21fd4e8b-7655-4742-96ee-c0018db3ac5d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-59a7ea0d-16ce-4a74-9c5b-ee970251c696" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.721 182627 DEBUG nova.network.neutron [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.740 182627 INFO nova.compute.manager [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Took 0.91 seconds to deallocate network for instance.#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.814 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.815 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.877 182627 DEBUG nova.compute.provider_tree [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.887 182627 DEBUG nova.scheduler.client.report [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.905 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.926 182627 INFO nova.scheduler.client.report [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance 59a7ea0d-16ce-4a74-9c5b-ee970251c696#033[00m
Jan 22 17:51:28 np0005592767 nova_compute[182623]: 2026-01-22 22:51:28.980 182627 DEBUG oslo_concurrency.lockutils [None req-cf022f9b-80d7-4277-a088-a1799f5c1bcc 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:29 np0005592767 podman[238253]: 2026-01-22 22:51:29.146247126 +0000 UTC m=+0.058793744 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 17:51:29 np0005592767 podman[238254]: 2026-01-22 22:51:29.147004078 +0000 UTC m=+0.059168965 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.190 182627 DEBUG nova.compute.manager [req-9ca4f8ae-94c3-4763-8b2b-a1fcb96931e2 req-8823c6a3-2825-43c4-8f8a-cea6806747e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.190 182627 DEBUG oslo_concurrency.lockutils [req-9ca4f8ae-94c3-4763-8b2b-a1fcb96931e2 req-8823c6a3-2825-43c4-8f8a-cea6806747e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.190 182627 DEBUG oslo_concurrency.lockutils [req-9ca4f8ae-94c3-4763-8b2b-a1fcb96931e2 req-8823c6a3-2825-43c4-8f8a-cea6806747e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.190 182627 DEBUG oslo_concurrency.lockutils [req-9ca4f8ae-94c3-4763-8b2b-a1fcb96931e2 req-8823c6a3-2825-43c4-8f8a-cea6806747e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "59a7ea0d-16ce-4a74-9c5b-ee970251c696-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.191 182627 DEBUG nova.compute.manager [req-9ca4f8ae-94c3-4763-8b2b-a1fcb96931e2 req-8823c6a3-2825-43c4-8f8a-cea6806747e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] No waiting events found dispatching network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.191 182627 WARNING nova.compute.manager [req-9ca4f8ae-94c3-4763-8b2b-a1fcb96931e2 req-8823c6a3-2825-43c4-8f8a-cea6806747e7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received unexpected event network-vif-plugged-6328f179-0f3e-4d6d-9928-1a555225eb43 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:51:30 np0005592767 nova_compute[182623]: 2026-01-22 22:51:30.621 182627 DEBUG nova.compute.manager [req-630a8c91-1314-47dd-ba6f-95d626e6dd8d req-a6a6dfbb-598e-40db-af86-34b91269ba5e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Received event network-vif-deleted-6328f179-0f3e-4d6d-9928-1a555225eb43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:32 np0005592767 nova_compute[182623]: 2026-01-22 22:51:32.752 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:32 np0005592767 nova_compute[182623]: 2026-01-22 22:51:32.907 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:33 np0005592767 nova_compute[182623]: 2026-01-22 22:51:33.171 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:33 np0005592767 nova_compute[182623]: 2026-01-22 22:51:33.337 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:33 np0005592767 nova_compute[182623]: 2026-01-22 22:51:33.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:34 np0005592767 podman[238294]: 2026-01-22 22:51:34.143313284 +0000 UTC m=+0.058189277 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:51:34 np0005592767 nova_compute[182623]: 2026-01-22 22:51:34.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:34 np0005592767 nova_compute[182623]: 2026-01-22 22:51:34.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:35 np0005592767 nova_compute[182623]: 2026-01-22 22:51:35.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:35 np0005592767 nova_compute[182623]: 2026-01-22 22:51:35.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:51:35 np0005592767 nova_compute[182623]: 2026-01-22 22:51:35.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:51:36 np0005592767 nova_compute[182623]: 2026-01-22 22:51:36.044 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.908 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:37 np0005592767 nova_compute[182623]: 2026-01-22 22:51:37.920 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.161 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.162 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5694MB free_disk=73.05243301391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.162 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.162 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.225 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.226 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.245 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.260 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.261 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.279 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.306 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.329 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.351 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.381 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:51:38 np0005592767 nova_compute[182623]: 2026-01-22 22:51:38.382 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:40 np0005592767 nova_compute[182623]: 2026-01-22 22:51:40.381 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:41 np0005592767 nova_compute[182623]: 2026-01-22 22:51:41.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:41 np0005592767 nova_compute[182623]: 2026-01-22 22:51:41.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:51:42 np0005592767 nova_compute[182623]: 2026-01-22 22:51:42.723 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122287.7219634, 59a7ea0d-16ce-4a74-9c5b-ee970251c696 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:51:42 np0005592767 nova_compute[182623]: 2026-01-22 22:51:42.724 182627 INFO nova.compute.manager [-] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:51:42 np0005592767 nova_compute[182623]: 2026-01-22 22:51:42.742 182627 DEBUG nova.compute.manager [None req-a0a3a72d-8143-4c03-823f-7b0a1c259b20 - - - - - -] [instance: 59a7ea0d-16ce-4a74-9c5b-ee970251c696] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:51:42 np0005592767 nova_compute[182623]: 2026-01-22 22:51:42.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:42 np0005592767 nova_compute[182623]: 2026-01-22 22:51:42.910 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:43 np0005592767 nova_compute[182623]: 2026-01-22 22:51:43.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:46 np0005592767 podman[238319]: 2026-01-22 22:51:46.159752438 +0000 UTC m=+0.075355172 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:51:47 np0005592767 nova_compute[182623]: 2026-01-22 22:51:47.766 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:47 np0005592767 nova_compute[182623]: 2026-01-22 22:51:47.912 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:48 np0005592767 nova_compute[182623]: 2026-01-22 22:51:48.890 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:48 np0005592767 nova_compute[182623]: 2026-01-22 22:51:48.908 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.511 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.512 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.543 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.684 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.685 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.700 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.700 182627 INFO nova.compute.claims [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.831 182627 DEBUG nova.compute.provider_tree [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.846 182627 DEBUG nova.scheduler.client.report [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.870 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.870 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.926 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.926 182627 DEBUG nova.network.neutron [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.942 182627 INFO nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:51:49 np0005592767 nova_compute[182623]: 2026-01-22 22:51:49.961 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.091 182627 DEBUG nova.policy [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.095 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.096 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.097 182627 INFO nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Creating image(s)#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.097 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.097 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.098 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.114 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.191 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.192 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.193 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.209 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.278 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.279 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.315 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.316 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.317 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.369 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.370 182627 DEBUG nova.virt.disk.api [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.371 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.424 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.425 182627 DEBUG nova.virt.disk.api [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.426 182627 DEBUG nova.objects.instance [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid 16d58120-504d-4493-a805-fd0f148ce748 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.446 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.446 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Ensure instance console log exists: /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.447 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.447 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:50 np0005592767 nova_compute[182623]: 2026-01-22 22:51:50.447 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:51 np0005592767 nova_compute[182623]: 2026-01-22 22:51:51.141 182627 DEBUG nova.network.neutron [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Successfully created port: bea3ca79-ba9e-4ec4-b852-24da44f1a2fe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:51:52 np0005592767 podman[238355]: 2026-01-22 22:51:52.152971433 +0000 UTC m=+0.066317877 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 17:51:52 np0005592767 podman[238354]: 2026-01-22 22:51:52.195168507 +0000 UTC m=+0.107602825 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.246 182627 DEBUG nova.network.neutron [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Successfully updated port: bea3ca79-ba9e-4ec4-b852-24da44f1a2fe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.267 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.268 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.268 182627 DEBUG nova.network.neutron [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.408 182627 DEBUG nova.compute.manager [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-changed-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.409 182627 DEBUG nova.compute.manager [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Refreshing instance network info cache due to event network-changed-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.409 182627 DEBUG oslo_concurrency.lockutils [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.477 182627 DEBUG nova.network.neutron [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.768 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:52 np0005592767 nova_compute[182623]: 2026-01-22 22:51:52.913 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.838 182627 DEBUG nova.network.neutron [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updating instance_info_cache with network_info: [{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.856 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.856 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Instance network_info: |[{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.857 182627 DEBUG oslo_concurrency.lockutils [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.857 182627 DEBUG nova.network.neutron [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Refreshing network info cache for port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.860 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Start _get_guest_xml network_info=[{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.863 182627 WARNING nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.872 182627 DEBUG nova.virt.libvirt.host [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.873 182627 DEBUG nova.virt.libvirt.host [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.878 182627 DEBUG nova.virt.libvirt.host [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.878 182627 DEBUG nova.virt.libvirt.host [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.879 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.880 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.880 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.880 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.881 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.881 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.881 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.882 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.882 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.882 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.883 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.883 182627 DEBUG nova.virt.hardware [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.886 182627 DEBUG nova.virt.libvirt.vif [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:51:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=171,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0cGK3LKHwzdETPR4h/wsFOlQTgLcTBgGmh+oIVxr2QEdAAv0pBtVq/m3C1CfcZqChuJGcFGvftkc/0Ge0AL9LRFqxGJjVh++AZArtI8LDhTzymo8V5qBXUKlxchJ4llA==',key_name='tempest-TestSecurityGroupsBasicOps-1788231567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-r2019l3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:51:49Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=16d58120-504d-4493-a805-fd0f148ce748,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.887 182627 DEBUG nova.network.os_vif_util [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.887 182627 DEBUG nova.network.os_vif_util [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.888 182627 DEBUG nova.objects.instance [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid 16d58120-504d-4493-a805-fd0f148ce748 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.900 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <uuid>16d58120-504d-4493-a805-fd0f148ce748</uuid>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <name>instance-000000ab</name>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970</nova:name>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:51:53</nova:creationTime>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        <nova:port uuid="bea3ca79-ba9e-4ec4-b852-24da44f1a2fe">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <entry name="serial">16d58120-504d-4493-a805-fd0f148ce748</entry>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <entry name="uuid">16d58120-504d-4493-a805-fd0f148ce748</entry>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.config"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:60:6d:78"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <target dev="tapbea3ca79-ba"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/console.log" append="off"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:51:53 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:51:53 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:51:53 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:51:53 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.901 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Preparing to wait for external event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.901 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.901 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.902 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.903 182627 DEBUG nova.virt.libvirt.vif [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:51:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=171,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0cGK3LKHwzdETPR4h/wsFOlQTgLcTBgGmh+oIVxr2QEdAAv0pBtVq/m3C1CfcZqChuJGcFGvftkc/0Ge0AL9LRFqxGJjVh++AZArtI8LDhTzymo8V5qBXUKlxchJ4llA==',key_name='tempest-TestSecurityGroupsBasicOps-1788231567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-r2019l3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:51:49Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=16d58120-504d-4493-a805-fd0f148ce748,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.903 182627 DEBUG nova.network.os_vif_util [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.904 182627 DEBUG nova.network.os_vif_util [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.904 182627 DEBUG os_vif [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.905 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.905 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.906 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.909 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.909 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbea3ca79-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.909 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbea3ca79-ba, col_values=(('external_ids', {'iface-id': 'bea3ca79-ba9e-4ec4-b852-24da44f1a2fe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:6d:78', 'vm-uuid': '16d58120-504d-4493-a805-fd0f148ce748'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.947 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:53 np0005592767 NetworkManager[54973]: <info>  [1769122313.9492] manager: (tapbea3ca79-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.950 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.953 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.954 182627 INFO os_vif [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba')#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.992 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.992 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.992 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:60:6d:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:51:53 np0005592767 nova_compute[182623]: 2026-01-22 22:51:53.993 182627 INFO nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Using config drive#033[00m
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.514 182627 INFO nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Creating config drive at /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.config#033[00m
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.523 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsilap84t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.654 182627 DEBUG oslo_concurrency.processutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsilap84t" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:51:54 np0005592767 kernel: tapbea3ca79-ba: entered promiscuous mode
Jan 22 17:51:54 np0005592767 NetworkManager[54973]: <info>  [1769122314.7256] manager: (tapbea3ca79-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/350)
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.727 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:54Z|00751|binding|INFO|Claiming lport bea3ca79-ba9e-4ec4-b852-24da44f1a2fe for this chassis.
Jan 22 17:51:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:54Z|00752|binding|INFO|bea3ca79-ba9e-4ec4-b852-24da44f1a2fe: Claiming fa:16:3e:60:6d:78 10.100.0.11
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.729 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:54 np0005592767 systemd-udevd[238417]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.763 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:6d:78 10.100.0.11'], port_security=['fa:16:3e:60:6d:78 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '16d58120-504d-4493-a805-fd0f148ce748', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '26a6c206-3f87-4a1c-b08e-7e4fba0abf68 a7c55ab8-6203-4af9-8fdd-4cbb6d4c41f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8674c23-6131-45a3-8d73-f4e7ea0ce5e8, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.765 104135 INFO neutron.agent.ovn.metadata.agent [-] Port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe in datapath e3bdfa79-ec53-4a7f-b83a-e5086bff52fd bound to our chassis#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.768 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bdfa79-ec53-4a7f-b83a-e5086bff52fd#033[00m
Jan 22 17:51:54 np0005592767 NetworkManager[54973]: <info>  [1769122314.7746] device (tapbea3ca79-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:51:54 np0005592767 NetworkManager[54973]: <info>  [1769122314.7754] device (tapbea3ca79-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.784 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[16be98e7-dd9c-4e10-beaa-5f3eaff7587d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.785 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3bdfa79-e1 in ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:51:54 np0005592767 systemd-machined[153912]: New machine qemu-90-instance-000000ab.
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.787 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3bdfa79-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.787 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[329957f2-27ba-4856-b30a-2ca922609c75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.789 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd1a462-77bd-4f8c-bfac-a15c78c2134c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.793 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:54Z|00753|binding|INFO|Setting lport bea3ca79-ba9e-4ec4-b852-24da44f1a2fe ovn-installed in OVS
Jan 22 17:51:54 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:54Z|00754|binding|INFO|Setting lport bea3ca79-ba9e-4ec4-b852-24da44f1a2fe up in Southbound
Jan 22 17:51:54 np0005592767 nova_compute[182623]: 2026-01-22 22:51:54.799 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:54 np0005592767 systemd[1]: Started Virtual Machine qemu-90-instance-000000ab.
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.802 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[96be24f1-9dfd-465d-a7e3-0fdef03a7e37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.826 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b89d491a-2a65-48e5-af73-16500dca6678]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.863 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[381e8a08-0bf3-4d48-ba56-eb0190aac2e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 NetworkManager[54973]: <info>  [1769122314.8714] manager: (tape3bdfa79-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/351)
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.871 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5a893d4d-635f-4f25-b4a2-e40edbc41ea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.916 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[f1388c8a-48f9-4f70-bd25-cc6ca8eb0688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.920 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1bfb229b-33f3-4128-b31a-a22d37e7ab9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 NetworkManager[54973]: <info>  [1769122314.9517] device (tape3bdfa79-e0): carrier: link connected
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.957 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d5666b3a-19b7-4c58-98bc-a3f6da98947c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:54.983 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8b43c1-d93b-479c-99a6-03ba2a180052]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bdfa79-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:a8:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591153, 'reachable_time': 24072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238453, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.006 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[001af21c-0e81-4b7c-85be-75bb5e52349d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:a8c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591153, 'tstamp': 591153}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238454, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.033 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18ab0590-6285-40fa-a8d0-7bcae2163846]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bdfa79-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:a8:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591153, 'reachable_time': 24072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238455, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.066 182627 DEBUG nova.network.neutron [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updated VIF entry in instance network info cache for port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.067 182627 DEBUG nova.network.neutron [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updating instance_info_cache with network_info: [{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.069 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6d26aa7b-f556-4b09-b07d-0ebb98754ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.080 182627 DEBUG oslo_concurrency.lockutils [req-78130e71-da2a-40d2-8d42-395e5a183958 req-2587cd08-5059-4596-b30c-8c1887dca329 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.152 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[24f754d2-9f15-413d-8754-09bad49c635d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.154 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bdfa79-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.154 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.154 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bdfa79-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.157 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:55 np0005592767 kernel: tape3bdfa79-e0: entered promiscuous mode
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:55 np0005592767 NetworkManager[54973]: <info>  [1769122315.1605] manager: (tape3bdfa79-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.162 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bdfa79-e0, col_values=(('external_ids', {'iface-id': 'b2aeb8c1-7e46-42c5-86c7-fddcc6060da1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:55 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:55Z|00755|binding|INFO|Releasing lport b2aeb8c1-7e46-42c5-86c7-fddcc6060da1 from this chassis (sb_readonly=0)
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.166 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.167 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e879eb84-9b41-482b-b54b-86e2b5019acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.168 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.pid.haproxy
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID e3bdfa79-ec53-4a7f-b83a-e5086bff52fd
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:51:55 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:51:55.169 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'env', 'PROCESS_TAG=haproxy-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3bdfa79-ec53-4a7f-b83a-e5086bff52fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.175 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.383 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122315.3828282, 16d58120-504d-4493-a805-fd0f148ce748 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.384 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] VM Started (Lifecycle Event)#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.413 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.417 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122315.3841803, 16d58120-504d-4493-a805-fd0f148ce748 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.417 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.450 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.454 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.484 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.556 182627 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.557 182627 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.558 182627 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.559 182627 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.559 182627 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Processing event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.560 182627 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.560 182627 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.561 182627 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.561 182627 DEBUG oslo_concurrency.lockutils [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.562 182627 DEBUG nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] No waiting events found dispatching network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.562 182627 WARNING nova.compute.manager [req-1789576d-d2de-44f9-bfde-c4e773c92715 req-edf16741-b4ea-47e0-8687-942f24f7bad1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received unexpected event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.563 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.568 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122315.567944, 16d58120-504d-4493-a805-fd0f148ce748 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.569 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:51:55 np0005592767 podman[238494]: 2026-01-22 22:51:55.573344192 +0000 UTC m=+0.061373207 container create c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.572 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.579 182627 INFO nova.virt.libvirt.driver [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Instance spawned successfully.#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.579 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.600 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.607 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.610 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.611 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.611 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.612 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.612 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.613 182627 DEBUG nova.virt.libvirt.driver [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:51:55 np0005592767 systemd[1]: Started libpod-conmon-c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5.scope.
Jan 22 17:51:55 np0005592767 podman[238494]: 2026-01-22 22:51:55.535055439 +0000 UTC m=+0.023084544 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.641 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:51:55 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:51:55 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01cb9c7037f382bcb486ebf964c75c4b7a85b972c2dc114c755bf1a10e87e69a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:51:55 np0005592767 podman[238494]: 2026-01-22 22:51:55.666779775 +0000 UTC m=+0.154808810 container init c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:51:55 np0005592767 podman[238494]: 2026-01-22 22:51:55.671938131 +0000 UTC m=+0.159967136 container start c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:51:55 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [NOTICE]   (238513) : New worker (238515) forked
Jan 22 17:51:55 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [NOTICE]   (238513) : Loading success.
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.707 182627 INFO nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Took 5.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.707 182627 DEBUG nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.809 182627 INFO nova.compute.manager [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Took 6.17 seconds to build instance.#033[00m
Jan 22 17:51:55 np0005592767 nova_compute[182623]: 2026-01-22 22:51:55.832 182627 DEBUG oslo_concurrency.lockutils [None req-9a684034-bb42-4b0e-8b88-bd522933f103 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:51:57 np0005592767 nova_compute[182623]: 2026-01-22 22:51:57.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:58 np0005592767 nova_compute[182623]: 2026-01-22 22:51:58.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:59 np0005592767 NetworkManager[54973]: <info>  [1769122319.5327] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 22 17:51:59 np0005592767 nova_compute[182623]: 2026-01-22 22:51:59.532 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:59 np0005592767 NetworkManager[54973]: <info>  [1769122319.5334] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 22 17:51:59 np0005592767 nova_compute[182623]: 2026-01-22 22:51:59.648 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:51:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:51:59Z|00756|binding|INFO|Releasing lport b2aeb8c1-7e46-42c5-86c7-fddcc6060da1 from this chassis (sb_readonly=0)
Jan 22 17:51:59 np0005592767 nova_compute[182623]: 2026-01-22 22:51:59.661 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:00 np0005592767 podman[238526]: 2026-01-22 22:52:00.154261647 +0000 UTC m=+0.070957878 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:52:00 np0005592767 podman[238525]: 2026-01-22 22:52:00.175143438 +0000 UTC m=+0.097091208 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Jan 22 17:52:00 np0005592767 nova_compute[182623]: 2026-01-22 22:52:00.970 182627 DEBUG nova.compute.manager [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-changed-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:52:00 np0005592767 nova_compute[182623]: 2026-01-22 22:52:00.971 182627 DEBUG nova.compute.manager [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Refreshing instance network info cache due to event network-changed-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:52:00 np0005592767 nova_compute[182623]: 2026-01-22 22:52:00.971 182627 DEBUG oslo_concurrency.lockutils [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:52:00 np0005592767 nova_compute[182623]: 2026-01-22 22:52:00.972 182627 DEBUG oslo_concurrency.lockutils [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:52:00 np0005592767 nova_compute[182623]: 2026-01-22 22:52:00.972 182627 DEBUG nova.network.neutron [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Refreshing network info cache for port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:52:02 np0005592767 nova_compute[182623]: 2026-01-22 22:52:02.918 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:03 np0005592767 nova_compute[182623]: 2026-01-22 22:52:03.984 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:04 np0005592767 nova_compute[182623]: 2026-01-22 22:52:04.989 182627 DEBUG nova.network.neutron [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updated VIF entry in instance network info cache for port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:52:04 np0005592767 nova_compute[182623]: 2026-01-22 22:52:04.990 182627 DEBUG nova.network.neutron [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updating instance_info_cache with network_info: [{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:52:05 np0005592767 nova_compute[182623]: 2026-01-22 22:52:05.011 182627 DEBUG oslo_concurrency.lockutils [req-9df6ea43-0b9c-4943-becd-2f8cb9c73c73 req-2d7fe1cc-d9de-4c3c-961f-bb4c5c61a01c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:52:05 np0005592767 podman[238566]: 2026-01-22 22:52:05.162060388 +0000 UTC m=+0.071515124 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.328 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '16d58120-504d-4493-a805-fd0f148ce748', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ab', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'user_id': '57cadc74575048b298f2ab431b92531e', 'hostId': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.329 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.329 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>]
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.332 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 16d58120-504d-4493-a805-fd0f148ce748 / tapbea3ca79-ba inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.332 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1490a6f7-a687-47a6-8aba-82e38d1c6f25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.330440', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa5aa19e-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'b51aabbec624d95d787a5827b3a30656687c3227595537f95b49ef985a58788c'}]}, 'timestamp': '2026-01-22 22:52:07.333650', '_unique_id': '768769c608aa4329a4a0e65c17d3c25a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.335 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.337 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.337 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.incoming.bytes volume: 844 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e98130c9-d37f-41a2-b5b0-4458b62361dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 844, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.337169', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa5b3f78-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': '846ff0f454162fb4ace32121db415b03a0a3497fa17fa9214622df71e3bd4330'}]}, 'timestamp': '2026-01-22 22:52:07.337472', '_unique_id': 'c4ea91aad55c455fb75159bdeed97882'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d6d580b-829d-4156-aa3b-490c474854bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.339029', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa5b8816-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'fbdc9a728a63f50b0186d1c2e4b893aae4f7146de6112af8989da69373c33b46'}]}, 'timestamp': '2026-01-22 22:52:07.339332', '_unique_id': '37727807f32f47c9a01c7d00148a0c76'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.339 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.341 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.341 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>]
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.342 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.351 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.352 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17583af6-299f-4674-b2ee-c447c4e1b487', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.342508', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa5d8774-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.977422977, 'message_signature': '6d0813b85c136138bd8206dab5a9677f35e794a5dce00afa4543f92127395a02'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 
'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.342508', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa5d966a-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.977422977, 'message_signature': 'b52cc9c5c3988ecdab374f3adbbc87758f30d22a296614d6be9ec96d285e174f'}]}, 'timestamp': '2026-01-22 22:52:07.352845', '_unique_id': '136ff1340f284e4795d0144420685616'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.355 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.370 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e0f93fa-75d2-40e3-8d6a-61bfa8e498cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748', 'timestamp': '2026-01-22T22:52:07.355384', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'fa60684a-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.005506361, 'message_signature': 'fd732a2e248c9c9e6eed1f720790f7dd65741bf8a9bc6311a7a5a300c22fd393'}]}, 'timestamp': '2026-01-22 22:52:07.371438', '_unique_id': 'b0610d6b4f4e446580c5133adc60fbaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.372 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.402 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.read.latency volume: 151606715 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.403 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.read.latency volume: 17248370 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08108c38-bda2-4fc9-9198-b042c931f93f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 151606715, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.374362', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa654432-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '37ab422420ab22d8487758a54b582a0dd20b6e016e86621d2a12335dedb03c49'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17248370, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 
'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.374362', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa655a80-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': 'ffeab46571cb806376888740a233f0dd7da11f8145865e066246d5b039cc4987'}]}, 'timestamp': '2026-01-22 22:52:07.404077', '_unique_id': 'c241f42c9aa44e91851c3d7f74d91d0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.407 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6e3065b-9a54-4d9b-be68-bd6e25f91683', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.407256', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa65f882-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'd59135aa9c098eb4a0c413abf7ae0af0a046853e686087368f2504381fb0a3fb'}]}, 'timestamp': '2026-01-22 22:52:07.407879', '_unique_id': 'ea3dfc1f1b42448686dc746e1d828261'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.410 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.410 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.410 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be78824-ea50-4f05-976e-01750b8f0ddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.410405', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa666ef2-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.977422977, 'message_signature': '7c1157d5a2d3bd3740ccbd72a0701b79f41718c6d7eebed23127da831fbc2b6f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.410405', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa668022-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.977422977, 'message_signature': '0e76d2b3e226cff6675f73b21ed12ea384bf60b2f0a3b9ea5ba522978143ba67'}]}, 'timestamp': '2026-01-22 22:52:07.411341', '_unique_id': 'bd3eb4bfee754a6d95a952da95162acc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.412 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.413 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.414 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>]
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.414 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.414 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ecb25bd-61de-4815-9f9a-edb9040354b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.414721', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa671762-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'c3212b28001ffd1de3eecd2b658509594d9820d29852e88d75d660cf4edc3457'}]}, 'timestamp': '2026-01-22 22:52:07.415248', '_unique_id': '01d530930625486ebcf6b9d99bba94f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.417 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.417 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.read.requests volume: 1092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.418 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc875ad4-4fc7-4345-a765-3621f1909450', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1092, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.417806', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa679020-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '7502400fd1684dee8c2352c318dbb7172f23d4ec74e82c28927db573b138ff2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.417806', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa67a65a-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '86220ef32e5b7ecb88a4162b7fffd8b419cc9e72e87126b799618b62d84b43cb'}]}, 'timestamp': '2026-01-22 22:52:07.418849', '_unique_id': 'd9fbc790415a4bd38c48a0e63affb083'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.421 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.421 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.write.requests volume: 292 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.421 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad09e032-4a1a-44ae-9855-789d04d29bfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 292, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.421375', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa681f72-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': 'd038b0cb6b72047a96fbf691be60220d8172bbd439b62364fc2ce97f1eff836a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.421375', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa68326e-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '611f12223f933b9532f6d6033fa431341928e044cbe61ad9922e4c19954f037c'}]}, 'timestamp': '2026-01-22 22:52:07.422429', '_unique_id': 'd752248cac994ea8adc7d91e24f01e14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.424 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.425 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21fba0e7-5146-4611-b335-11f0dbe3be58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.425071', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa68ae92-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'f742760f3e81ddea0a6459b395564b342017b16d650bdebe033fa28bb8fa662b'}]}, 'timestamp': '2026-01-22 22:52:07.425681', '_unique_id': '5c153634773e4e3792f4e9721cbc3a08'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.427 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.428 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9b16242-433c-482a-ab7a-1ec493d3fbeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.428088', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa6923a4-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'd1bc75929fd2e2b6e9ae88a679aa627f5d3298249eac135131ebc9a775196e01'}]}, 'timestamp': '2026-01-22 22:52:07.428638', '_unique_id': 'a2bb416b7afb4dd0bc6cf337f6785947'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.430 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.431 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4264e4be-3022-4d75-bd66-8494cbc939e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.431027', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa6995e6-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': '56f59965530af4e7ecb7781186c72c4e94d2e8bc2c7ed23967ce4224fd306ecb'}]}, 'timestamp': '2026-01-22 22:52:07.431564', '_unique_id': '76048406cd7342d59da9352d6b0db356'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.433 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.434 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.write.latency volume: 1679004624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.434 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3cb4032-a3a6-4a3f-815c-d2ec0b98217c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1679004624, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.434100', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa6a0de6-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '59923bbe9a190ff29c24955c7c90bad582879ac85416025e29a34234278da9a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.434100', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa6a1f20-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': 'e5de05d264e94d1b4010342d645f704d9e8133e05c93e1f9b52370fa85023d98'}]}, 'timestamp': '2026-01-22 22:52:07.435040', '_unique_id': '6310915f7f6d4b648b20be919b0d22d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.436 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.437 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.437 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/cpu volume: 10540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f20a752f-9fc7-4504-826f-d8f86697e73d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10540000000, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748', 'timestamp': '2026-01-22T22:52:07.437812', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'fa6aa210-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.005506361, 'message_signature': '1047e3d09869f08ffb7dbdecb56d5adbc9297bb72abd4001c92413bb827a942b'}]}, 'timestamp': '2026-01-22 22:52:07.438530', '_unique_id': 'b1bd73c67483483689417f1a753eb986'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.441 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05ec8dc6-85c1-4783-936a-14b94c1071c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.441366', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa6b2a8c-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': 'eddc162c82e4319a7b924dd08a9cbe3f0309e6243e288657cf3611c507a89003'}]}, 'timestamp': '2026-01-22 22:52:07.441923', '_unique_id': '89aabed7457a4313910175f43c284f4c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.443 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '735c4552-f53a-41d5-967f-9f32f045f654', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000ab-16d58120-504d-4493-a805-fd0f148ce748-tapbea3ca79-ba', 'timestamp': '2026-01-22T22:52:07.443959', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'tapbea3ca79-ba', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:6d:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbea3ca79-ba'}, 'message_id': 'fa6b8a2c-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.965306854, 'message_signature': '2f8403b3006680232d1b2ab1ee19e40d842b80563f4a958bdc7f03309ce3011b'}]}, 'timestamp': '2026-01-22 22:52:07.444317', '_unique_id': '62afd12409e14ce9afcccf8c6ce9a4fc'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.445 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.446 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.read.bytes volume: 30312960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.446 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '249d3dbe-a483-4d35-a93b-7769c2e1f747', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30312960, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.446074', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa6bdda6-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '79531a0e10bf5580d61a9d3ef7465b66d6296ff150a950932c68836dc15a564d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 
'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.446074', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa6be990-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '17f24410b0cb7b4857c88ba4cabbf3e4484cd0e0d5d5ecee214a3a0346924491'}]}, 'timestamp': '2026-01-22 22:52:07.446709', '_unique_id': '72eb82b6538b4eeab42a5a31be5dc974'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.447 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.448 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.448 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.448 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a75d99f-172d-40e0-a8e7-71957366acb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.448538', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa6c3db4-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.977422977, 'message_signature': '001595dfb4d5ee64b9eb89fbbbf76add4347d342ecb01594010064aa49145650'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 
'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.448538', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa6c4962-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5923.977422977, 'message_signature': '55e512531e3691a2289cab2a0bd956411fbbcd570107b647fd4ba7ab960b6ec7'}]}, 'timestamp': '2026-01-22 22:52:07.449228', '_unique_id': '83ec99a0c3674edd8ce02919766265ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.449 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.450 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.451 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970>]
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.451 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.451 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.451 12 DEBUG ceilometer.compute.pollsters [-] 16d58120-504d-4493-a805-fd0f148ce748/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bc9e755-60dd-4e47-a3b9-2605e63c237b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-vda', 'timestamp': '2026-01-22T22:52:07.451389', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fa6cac18-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': 'ee480ec89ea0cfc4d4f62bf99917dc38c8f216acbfa1975bc9eaf0c94f243515'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 
'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '16d58120-504d-4493-a805-fd0f148ce748-sda', 'timestamp': '2026-01-22T22:52:07.451389', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970', 'name': 'instance-000000ab', 'instance_id': '16d58120-504d-4493-a805-fd0f148ce748', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fa6cb780-f7e4-11f0-a43a-fa163ed01feb', 'monotonic_time': 5924.009270208, 'message_signature': '69070a78d5e4b4d3f289afd563f3abed5740db65c0cceb81f8523984ed305614'}]}, 'timestamp': '2026-01-22 22:52:07.451977', '_unique_id': '39d1b2174c4f401e94078b81be42b824'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:52:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:52:07.452 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:52:07 np0005592767 nova_compute[182623]: 2026-01-22 22:52:07.920 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:52:07Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:6d:78 10.100.0.11
Jan 22 17:52:07 np0005592767 ovn_controller[94769]: 2026-01-22T22:52:07Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:6d:78 10.100.0.11
Jan 22 17:52:08 np0005592767 nova_compute[182623]: 2026-01-22 22:52:08.987 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:52:11Z|00757|binding|INFO|Releasing lport b2aeb8c1-7e46-42c5-86c7-fddcc6060da1 from this chassis (sb_readonly=0)
Jan 22 17:52:11 np0005592767 nova_compute[182623]: 2026-01-22 22:52:11.225 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:12.123 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:12.125 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:12.126 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:12 np0005592767 nova_compute[182623]: 2026-01-22 22:52:12.922 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:13 np0005592767 nova_compute[182623]: 2026-01-22 22:52:13.989 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:17 np0005592767 podman[238615]: 2026-01-22 22:52:17.166930856 +0000 UTC m=+0.072805580 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 17:52:17 np0005592767 nova_compute[182623]: 2026-01-22 22:52:17.924 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:18 np0005592767 nova_compute[182623]: 2026-01-22 22:52:18.992 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:21.079 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:52:21 np0005592767 nova_compute[182623]: 2026-01-22 22:52:21.080 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:21 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:21.080 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:52:22 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:22.082 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:52:22 np0005592767 nova_compute[182623]: 2026-01-22 22:52:22.165 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:22 np0005592767 nova_compute[182623]: 2026-01-22 22:52:22.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:23 np0005592767 podman[238637]: 2026-01-22 22:52:23.139101016 +0000 UTC m=+0.059992658 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., 
managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:52:23 np0005592767 podman[238636]: 2026-01-22 22:52:23.187246758 +0000 UTC m=+0.111066043 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 17:52:24 np0005592767 nova_compute[182623]: 2026-01-22 22:52:24.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:24 np0005592767 nova_compute[182623]: 2026-01-22 22:52:24.110 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:26.190 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2 2001:db8::f816:3eff:fe8c:995a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8c:995a/64', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35e54559-2757-4541-b2f6-2cb439f23e24, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7d34b7c4-140b-4dda-ad27-aa5734d5709c) old=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:52:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:26.192 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7d34b7c4-140b-4dda-ad27-aa5734d5709c in datapath bcbb187b-81b7-4e4f-9a13-417cae17c3c3 updated#033[00m
Jan 22 17:52:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:26.195 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:52:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:26.197 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6f1195-75cf-4ae2-84bc-6c8a2eaacbcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:27 np0005592767 nova_compute[182623]: 2026-01-22 22:52:27.929 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:29 np0005592767 nova_compute[182623]: 2026-01-22 22:52:29.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:31 np0005592767 podman[238681]: 2026-01-22 22:52:31.130934693 +0000 UTC m=+0.055280075 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:52:31 np0005592767 podman[238682]: 2026-01-22 22:52:31.156205518 +0000 UTC m=+0.066166673 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:52:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:31.640 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2 2001:db8:0:1:f816:3eff:fe8c:995a 2001:db8::f816:3eff:fe8c:995a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe8c:995a/64 2001:db8::f816:3eff:fe8c:995a/64', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35e54559-2757-4541-b2f6-2cb439f23e24, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7d34b7c4-140b-4dda-ad27-aa5734d5709c) old=Port_Binding(mac=['fa:16:3e:8c:99:5a 10.100.0.2 2001:db8::f816:3eff:fe8c:995a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8c:995a/64', 'neutron:device_id': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:52:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:31.641 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7d34b7c4-140b-4dda-ad27-aa5734d5709c in datapath bcbb187b-81b7-4e4f-9a13-417cae17c3c3 updated#033[00m
Jan 22 17:52:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:31.642 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:52:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:31.643 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[11b6828c-5189-4baa-b10b-afed53d379d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:32 np0005592767 nova_compute[182623]: 2026-01-22 22:52:32.930 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:33 np0005592767 nova_compute[182623]: 2026-01-22 22:52:33.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:34 np0005592767 nova_compute[182623]: 2026-01-22 22:52:34.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:35 np0005592767 nova_compute[182623]: 2026-01-22 22:52:35.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:35 np0005592767 nova_compute[182623]: 2026-01-22 22:52:35.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:52:35 np0005592767 nova_compute[182623]: 2026-01-22 22:52:35.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:52:36 np0005592767 podman[238725]: 2026-01-22 22:52:36.197875685 +0000 UTC m=+0.107734618 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:52:36 np0005592767 nova_compute[182623]: 2026-01-22 22:52:36.264 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:52:36 np0005592767 nova_compute[182623]: 2026-01-22 22:52:36.264 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:52:36 np0005592767 nova_compute[182623]: 2026-01-22 22:52:36.264 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:52:36 np0005592767 nova_compute[182623]: 2026-01-22 22:52:36.264 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 16d58120-504d-4493-a805-fd0f148ce748 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:52:37 np0005592767 nova_compute[182623]: 2026-01-22 22:52:37.845 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updating instance_info_cache with network_info: [{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:52:37 np0005592767 nova_compute[182623]: 2026-01-22 22:52:37.861 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:52:37 np0005592767 nova_compute[182623]: 2026-01-22 22:52:37.861 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:52:37 np0005592767 nova_compute[182623]: 2026-01-22 22:52:37.862 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:37 np0005592767 nova_compute[182623]: 2026-01-22 22:52:37.862 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:37 np0005592767 nova_compute[182623]: 2026-01-22 22:52:37.932 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:38 np0005592767 nova_compute[182623]: 2026-01-22 22:52:38.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:38 np0005592767 nova_compute[182623]: 2026-01-22 22:52:38.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:38 np0005592767 nova_compute[182623]: 2026-01-22 22:52:38.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:38 np0005592767 nova_compute[182623]: 2026-01-22 22:52:38.921 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:38 np0005592767 nova_compute[182623]: 2026-01-22 22:52:38.921 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:52:38 np0005592767 nova_compute[182623]: 2026-01-22 22:52:38.992 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.069 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.070 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.091 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.122 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.274 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.275 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5535MB free_disk=73.02371597290039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.276 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.276 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.585 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 16d58120-504d-4493-a805-fd0f148ce748 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.586 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.586 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.641 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.659 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.680 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:52:39 np0005592767 nova_compute[182623]: 2026-01-22 22:52:39.681 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:41 np0005592767 nova_compute[182623]: 2026-01-22 22:52:41.681 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:42 np0005592767 nova_compute[182623]: 2026-01-22 22:52:42.935 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:43 np0005592767 nova_compute[182623]: 2026-01-22 22:52:43.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:43 np0005592767 nova_compute[182623]: 2026-01-22 22:52:43.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:43 np0005592767 nova_compute[182623]: 2026-01-22 22:52:43.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:52:44 np0005592767 nova_compute[182623]: 2026-01-22 22:52:44.135 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:47 np0005592767 nova_compute[182623]: 2026-01-22 22:52:47.939 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:48 np0005592767 podman[238754]: 2026-01-22 22:52:48.165787021 +0000 UTC m=+0.075683342 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 17:52:49 np0005592767 nova_compute[182623]: 2026-01-22 22:52:49.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:50 np0005592767 nova_compute[182623]: 2026-01-22 22:52:50.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:52:52 np0005592767 nova_compute[182623]: 2026-01-22 22:52:52.946 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.882 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.882 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.882 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.883 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.883 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.893 182627 INFO nova.compute.manager [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Terminating instance#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.901 182627 DEBUG nova.compute.manager [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:52:53 np0005592767 kernel: tapbea3ca79-ba (unregistering): left promiscuous mode
Jan 22 17:52:53 np0005592767 NetworkManager[54973]: <info>  [1769122373.9244] device (tapbea3ca79-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.931 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:52:53Z|00758|binding|INFO|Releasing lport bea3ca79-ba9e-4ec4-b852-24da44f1a2fe from this chassis (sb_readonly=0)
Jan 22 17:52:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:52:53Z|00759|binding|INFO|Setting lport bea3ca79-ba9e-4ec4-b852-24da44f1a2fe down in Southbound
Jan 22 17:52:53 np0005592767 ovn_controller[94769]: 2026-01-22T22:52:53Z|00760|binding|INFO|Removing iface tapbea3ca79-ba ovn-installed in OVS
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.933 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:53.938 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:6d:78 10.100.0.11'], port_security=['fa:16:3e:60:6d:78 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '16d58120-504d-4493-a805-fd0f148ce748', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '26a6c206-3f87-4a1c-b08e-7e4fba0abf68 a7c55ab8-6203-4af9-8fdd-4cbb6d4c41f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8674c23-6131-45a3-8d73-f4e7ea0ce5e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:52:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:53.940 104135 INFO neutron.agent.ovn.metadata.agent [-] Port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe in datapath e3bdfa79-ec53-4a7f-b83a-e5086bff52fd unbound from our chassis#033[00m
Jan 22 17:52:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:53.943 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:52:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:53.945 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9be206-3151-42bd-a1b4-4b6ed61577fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:53 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:53.945 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd namespace which is not needed anymore#033[00m
Jan 22 17:52:53 np0005592767 nova_compute[182623]: 2026-01-22 22:52:53.950 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:53 np0005592767 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 22 17:52:53 np0005592767 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ab.scope: Consumed 14.270s CPU time.
Jan 22 17:52:53 np0005592767 systemd-machined[153912]: Machine qemu-90-instance-000000ab terminated.
Jan 22 17:52:54 np0005592767 podman[238774]: 2026-01-22 22:52:54.036805377 +0000 UTC m=+0.092757284 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 17:52:54 np0005592767 podman[238778]: 2026-01-22 22:52:54.039116173 +0000 UTC m=+0.092197659 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Jan 22 17:52:54 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [NOTICE]   (238513) : haproxy version is 2.8.14-c23fe91
Jan 22 17:52:54 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [NOTICE]   (238513) : path to executable is /usr/sbin/haproxy
Jan 22 17:52:54 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [WARNING]  (238513) : Exiting Master process...
Jan 22 17:52:54 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [ALERT]    (238513) : Current worker (238515) exited with code 143 (Terminated)
Jan 22 17:52:54 np0005592767 neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd[238509]: [WARNING]  (238513) : All workers exited. Exiting... (0)
Jan 22 17:52:54 np0005592767 systemd[1]: libpod-c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5.scope: Deactivated successfully.
Jan 22 17:52:54 np0005592767 podman[238833]: 2026-01-22 22:52:54.073621409 +0000 UTC m=+0.045825487 container died c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 17:52:54 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5-userdata-shm.mount: Deactivated successfully.
Jan 22 17:52:54 np0005592767 systemd[1]: var-lib-containers-storage-overlay-01cb9c7037f382bcb486ebf964c75c4b7a85b972c2dc114c755bf1a10e87e69a-merged.mount: Deactivated successfully.
Jan 22 17:52:54 np0005592767 podman[238833]: 2026-01-22 22:52:54.105856431 +0000 UTC m=+0.078060519 container cleanup c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 17:52:54 np0005592767 systemd[1]: libpod-conmon-c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5.scope: Deactivated successfully.
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.140 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.164 182627 INFO nova.virt.libvirt.driver [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Instance destroyed successfully.#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.165 182627 DEBUG nova.objects.instance [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid 16d58120-504d-4493-a805-fd0f148ce748 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:52:54 np0005592767 podman[238872]: 2026-01-22 22:52:54.172102904 +0000 UTC m=+0.045461546 container remove c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.177 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed8710e-9b95-4041-8e26-e0ed3743d4c3]: (4, ('Thu Jan 22 10:52:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd (c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5)\nc140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5\nThu Jan 22 10:52:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd (c140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5)\nc140252283f6d2ede7181a430da25ff77e045f4b218d8cc3750e9f3610f8a2a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.178 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[08a125f7-dab3-4ae6-9f1f-d8737ad16045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.179 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bdfa79-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.180 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 kernel: tape3bdfa79-e0: left promiscuous mode
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.183 182627 DEBUG nova.virt.libvirt.vif [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:51:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-673622970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=171,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0cGK3LKHwzdETPR4h/wsFOlQTgLcTBgGmh+oIVxr2QEdAAv0pBtVq/m3C1CfcZqChuJGcFGvftkc/0Ge0AL9LRFqxGJjVh++AZArtI8LDhTzymo8V5qBXUKlxchJ4llA==',key_name='tempest-TestSecurityGroupsBasicOps-1788231567',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:51:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-r2019l3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:51:55Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=16d58120-504d-4493-a805-fd0f148ce748,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.183 182627 DEBUG nova.network.os_vif_util [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.184 182627 DEBUG nova.network.os_vif_util [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.184 182627 DEBUG os_vif [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.187 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.187 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbea3ca79-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.189 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.195 182627 DEBUG nova.compute.manager [req-242d588b-74ec-4e89-a856-16534c2b1796 req-3fab973c-1a85-4db3-9f5b-e1569c099d53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-vif-unplugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.195 182627 DEBUG oslo_concurrency.lockutils [req-242d588b-74ec-4e89-a856-16534c2b1796 req-3fab973c-1a85-4db3-9f5b-e1569c099d53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.196 182627 DEBUG oslo_concurrency.lockutils [req-242d588b-74ec-4e89-a856-16534c2b1796 req-3fab973c-1a85-4db3-9f5b-e1569c099d53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.196 182627 DEBUG oslo_concurrency.lockutils [req-242d588b-74ec-4e89-a856-16534c2b1796 req-3fab973c-1a85-4db3-9f5b-e1569c099d53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.196 182627 DEBUG nova.compute.manager [req-242d588b-74ec-4e89-a856-16534c2b1796 req-3fab973c-1a85-4db3-9f5b-e1569c099d53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] No waiting events found dispatching network-vif-unplugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.196 182627 DEBUG nova.compute.manager [req-242d588b-74ec-4e89-a856-16534c2b1796 req-3fab973c-1a85-4db3-9f5b-e1569c099d53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-vif-unplugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.200 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.202 182627 INFO os_vif [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:6d:78,bridge_name='br-int',has_traffic_filtering=True,id=bea3ca79-ba9e-4ec4-b852-24da44f1a2fe,network=Network(e3bdfa79-ec53-4a7f-b83a-e5086bff52fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea3ca79-ba')#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.202 182627 INFO nova.virt.libvirt.driver [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Deleting instance files /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748_del#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.203 182627 INFO nova.virt.libvirt.driver [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Deletion of /var/lib/nova/instances/16d58120-504d-4493-a805-fd0f148ce748_del complete#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.204 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7aecb0-eb72-4025-a11a-b4459722811f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.232 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ca17181b-eb7e-46b6-a457-3b04397496d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.233 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[49a6f0b1-1a41-4fdd-a7b5-62b3ef69dd6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.242 182627 DEBUG nova.compute.manager [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-changed-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.242 182627 DEBUG nova.compute.manager [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Refreshing instance network info cache due to event network-changed-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.242 182627 DEBUG oslo_concurrency.lockutils [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.242 182627 DEBUG oslo_concurrency.lockutils [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.242 182627 DEBUG nova.network.neutron [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Refreshing network info cache for port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.252 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb6e124-1ede-4e6e-9fca-f9ab7602bf92]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591143, 'reachable_time': 34234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238899, 'error': None, 'target': 'ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 systemd[1]: run-netns-ovnmeta\x2de3bdfa79\x2dec53\x2d4a7f\x2db83a\x2de5086bff52fd.mount: Deactivated successfully.
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.255 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3bdfa79-ec53-4a7f-b83a-e5086bff52fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:52:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:52:54.256 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a37df438-aad9-4bf8-bdff-d55eaab0ec83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.295 182627 INFO nova.compute.manager [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.295 182627 DEBUG oslo.service.loopingcall [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.295 182627 DEBUG nova.compute.manager [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:52:54 np0005592767 nova_compute[182623]: 2026-01-22 22:52:54.296 182627 DEBUG nova.network.neutron [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.684 182627 DEBUG nova.network.neutron [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.701 182627 INFO nova.compute.manager [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Took 1.41 seconds to deallocate network for instance.#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.778 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.779 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.806 182627 DEBUG nova.compute.manager [req-10de6fed-6272-449d-a2c0-f2d337b1bfaf req-b8c72b3c-c0e6-43bc-9d48-00f96b49b4f6 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-vif-deleted-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.865 182627 DEBUG nova.compute.provider_tree [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.877 182627 DEBUG nova.scheduler.client.report [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.893 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.919 182627 INFO nova.scheduler.client.report [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance 16d58120-504d-4493-a805-fd0f148ce748#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.962 182627 DEBUG nova.network.neutron [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updated VIF entry in instance network info cache for port bea3ca79-ba9e-4ec4-b852-24da44f1a2fe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:52:55 np0005592767 nova_compute[182623]: 2026-01-22 22:52:55.962 182627 DEBUG nova.network.neutron [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Updating instance_info_cache with network_info: [{"id": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "address": "fa:16:3e:60:6d:78", "network": {"id": "e3bdfa79-ec53-4a7f-b83a-e5086bff52fd", "bridge": "br-int", "label": "tempest-network-smoke--1070030353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea3ca79-ba", "ovs_interfaceid": "bea3ca79-ba9e-4ec4-b852-24da44f1a2fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.285 182627 DEBUG nova.compute.manager [req-bb467acb-f44a-4f9a-8638-e22548060d1c req-d41fca73-3147-4c14-94f5-1c5ead08910e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.285 182627 DEBUG oslo_concurrency.lockutils [req-bb467acb-f44a-4f9a-8638-e22548060d1c req-d41fca73-3147-4c14-94f5-1c5ead08910e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "16d58120-504d-4493-a805-fd0f148ce748-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.286 182627 DEBUG oslo_concurrency.lockutils [req-bb467acb-f44a-4f9a-8638-e22548060d1c req-d41fca73-3147-4c14-94f5-1c5ead08910e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.286 182627 DEBUG oslo_concurrency.lockutils [req-bb467acb-f44a-4f9a-8638-e22548060d1c req-d41fca73-3147-4c14-94f5-1c5ead08910e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.286 182627 DEBUG nova.compute.manager [req-bb467acb-f44a-4f9a-8638-e22548060d1c req-d41fca73-3147-4c14-94f5-1c5ead08910e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] No waiting events found dispatching network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.287 182627 WARNING nova.compute.manager [req-bb467acb-f44a-4f9a-8638-e22548060d1c req-d41fca73-3147-4c14-94f5-1c5ead08910e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Received unexpected event network-vif-plugged-bea3ca79-ba9e-4ec4-b852-24da44f1a2fe for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.297 182627 DEBUG oslo_concurrency.lockutils [req-7e0db5b1-2e24-44ea-ac25-8c1c33fab0c8 req-f16fa920-ae92-4dfa-b442-d7bf31e6b92c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-16d58120-504d-4493-a805-fd0f148ce748" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:52:56 np0005592767 nova_compute[182623]: 2026-01-22 22:52:56.330 182627 DEBUG oslo_concurrency.lockutils [None req-b19157f9-aa84-4cf7-9a70-8dd5ff2123a9 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "16d58120-504d-4493-a805-fd0f148ce748" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:52:57 np0005592767 nova_compute[182623]: 2026-01-22 22:52:57.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:52:59 np0005592767 nova_compute[182623]: 2026-01-22 22:52:59.189 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:02 np0005592767 podman[238901]: 2026-01-22 22:53:02.146658783 +0000 UTC m=+0.057272241 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:53:02 np0005592767 podman[238900]: 2026-01-22 22:53:02.1539966 +0000 UTC m=+0.068347384 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 17:53:02 np0005592767 nova_compute[182623]: 2026-01-22 22:53:02.951 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:04 np0005592767 nova_compute[182623]: 2026-01-22 22:53:04.389 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:06 np0005592767 nova_compute[182623]: 2026-01-22 22:53:06.403 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:06 np0005592767 nova_compute[182623]: 2026-01-22 22:53:06.535 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:07 np0005592767 podman[238944]: 2026-01-22 22:53:07.163418377 +0000 UTC m=+0.078402299 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:53:07 np0005592767 nova_compute[182623]: 2026-01-22 22:53:07.980 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:09 np0005592767 nova_compute[182623]: 2026-01-22 22:53:09.162 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122374.161065, 16d58120-504d-4493-a805-fd0f148ce748 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:53:09 np0005592767 nova_compute[182623]: 2026-01-22 22:53:09.163 182627 INFO nova.compute.manager [-] [instance: 16d58120-504d-4493-a805-fd0f148ce748] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:53:09 np0005592767 nova_compute[182623]: 2026-01-22 22:53:09.195 182627 DEBUG nova.compute.manager [None req-588cfad4-4cb4-483c-9519-3187c88056c4 - - - - - -] [instance: 16d58120-504d-4493-a805-fd0f148ce748] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:53:09 np0005592767 nova_compute[182623]: 2026-01-22 22:53:09.392 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:12.124 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:12.125 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:12.125 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:12 np0005592767 nova_compute[182623]: 2026-01-22 22:53:12.346 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:12 np0005592767 nova_compute[182623]: 2026-01-22 22:53:12.347 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:12 np0005592767 nova_compute[182623]: 2026-01-22 22:53:12.376 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:53:13 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.838 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:13 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.839 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:13 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.839 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:13 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.849 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:53:13 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.850 182627 INFO nova.compute.claims [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:53:13 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.984 182627 DEBUG nova.compute.provider_tree [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:13.999 182627 DEBUG nova.scheduler.client.report [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.026 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.028 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.109 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.110 182627 DEBUG nova.network.neutron [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.133 182627 INFO nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.152 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.284 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.285 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.286 182627 INFO nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Creating image(s)
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.286 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.287 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.288 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.305 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.371 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.372 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.373 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.383 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.408 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.468 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.469 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.525 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.526 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.526 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.616 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.617 182627 DEBUG nova.virt.disk.api [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.617 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.687 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.688 182627 DEBUG nova.virt.disk.api [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.688 182627 DEBUG nova.objects.instance [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid 8de59b8f-1662-4616-8bf5-101b1cfaa332 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.996 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.997 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Ensure instance console log exists: /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.997 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.998 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:53:14 np0005592767 nova_compute[182623]: 2026-01-22 22:53:14.998 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:53:15 np0005592767 nova_compute[182623]: 2026-01-22 22:53:15.114 182627 DEBUG nova.policy [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:53:17 np0005592767 nova_compute[182623]: 2026-01-22 22:53:17.134 182627 DEBUG nova.network.neutron [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Successfully created port: 3b7c3312-c221-4ab3-bea1-750b26bea2a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:53:17 np0005592767 nova_compute[182623]: 2026-01-22 22:53:17.984 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:19 np0005592767 podman[238984]: 2026-01-22 22:53:19.165902537 +0000 UTC m=+0.078787659 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.410 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.484 182627 DEBUG nova.network.neutron [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Successfully updated port: 3b7c3312-c221-4ab3-bea1-750b26bea2a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.499 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.499 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.500 182627 DEBUG nova.network.neutron [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.639 182627 DEBUG nova.compute.manager [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-changed-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.640 182627 DEBUG nova.compute.manager [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Refreshing instance network info cache due to event network-changed-3b7c3312-c221-4ab3-bea1-750b26bea2a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.640 182627 DEBUG oslo_concurrency.lockutils [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 17:53:19 np0005592767 nova_compute[182623]: 2026-01-22 22:53:19.713 182627 DEBUG nova.network.neutron [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 17:53:22 np0005592767 nova_compute[182623]: 2026-01-22 22:53:22.985 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:53:24 np0005592767 podman[239007]: 2026-01-22 22:53:24.134613413 +0000 UTC m=+0.051548101 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git)
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.185 182627 DEBUG nova.network.neutron [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updating instance_info_cache with network_info: [{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:53:24 np0005592767 podman[239006]: 2026-01-22 22:53:24.22355038 +0000 UTC m=+0.133967933 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.238 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.239 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Instance network_info: |[{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.239 182627 DEBUG oslo_concurrency.lockutils [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.240 182627 DEBUG nova.network.neutron [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Refreshing network info cache for port 3b7c3312-c221-4ab3-bea1-750b26bea2a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.244 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Start _get_guest_xml network_info=[{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.249 182627 WARNING nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.262 182627 DEBUG nova.virt.libvirt.host [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.263 182627 DEBUG nova.virt.libvirt.host [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.267 182627 DEBUG nova.virt.libvirt.host [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.268 182627 DEBUG nova.virt.libvirt.host [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.270 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.270 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.271 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.272 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.272 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.273 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.273 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.274 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.274 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.275 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.275 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.276 182627 DEBUG nova.virt.hardware [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.282 182627 DEBUG nova.virt.libvirt.vif [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:53:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1145811343',display_name='tempest-TestGettingAddress-server-1145811343',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1145811343',id=174,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqXnqrMY5CwIl+chhiY+V0KiwfjE1aQfBJnCBBn6VRqwsdAUJbkY/c6j2vgsYGCBMAwbME+hHoqS/0xDjD33F+ugZSs4aTJnliui0z8I9qIqN8oagqLB9AWr4k85MMj2w==',key_name='tempest-TestGettingAddress-1051750305',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-xd6jcfm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:53:14Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=8de59b8f-1662-4616-8bf5-101b1cfaa332,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.282 182627 DEBUG nova.network.os_vif_util [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.284 182627 DEBUG nova.network.os_vif_util [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.285 182627 DEBUG nova.objects.instance [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 8de59b8f-1662-4616-8bf5-101b1cfaa332 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.307 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <uuid>8de59b8f-1662-4616-8bf5-101b1cfaa332</uuid>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <name>instance-000000ae</name>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestGettingAddress-server-1145811343</nova:name>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:53:24</nova:creationTime>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        <nova:port uuid="3b7c3312-c221-4ab3-bea1-750b26bea2a7">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe95:95f3" ipVersion="6"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe95:95f3" ipVersion="6"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <entry name="serial">8de59b8f-1662-4616-8bf5-101b1cfaa332</entry>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <entry name="uuid">8de59b8f-1662-4616-8bf5-101b1cfaa332</entry>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.config"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:95:95:f3"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <target dev="tap3b7c3312-c2"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/console.log" append="off"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:53:24 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:53:24 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:53:24 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:53:24 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.309 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Preparing to wait for external event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.309 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.310 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.310 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.310 182627 DEBUG nova.virt.libvirt.vif [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:53:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1145811343',display_name='tempest-TestGettingAddress-server-1145811343',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1145811343',id=174,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqXnqrMY5CwIl+chhiY+V0KiwfjE1aQfBJnCBBn6VRqwsdAUJbkY/c6j2vgsYGCBMAwbME+hHoqS/0xDjD33F+ugZSs4aTJnliui0z8I9qIqN8oagqLB9AWr4k85MMj2w==',key_name='tempest-TestGettingAddress-1051750305',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-xd6jcfm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:53:14Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=8de59b8f-1662-4616-8bf5-101b1cfaa332,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.311 182627 DEBUG nova.network.os_vif_util [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.312 182627 DEBUG nova.network.os_vif_util [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.312 182627 DEBUG os_vif [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.313 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.313 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.313 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.316 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.316 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b7c3312-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.316 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b7c3312-c2, col_values=(('external_ids', {'iface-id': '3b7c3312-c221-4ab3-bea1-750b26bea2a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:95:f3', 'vm-uuid': '8de59b8f-1662-4616-8bf5-101b1cfaa332'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.318 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:24 np0005592767 NetworkManager[54973]: <info>  [1769122404.3197] manager: (tap3b7c3312-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.322 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.324 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.325 182627 INFO os_vif [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2')#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.398 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.398 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.398 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:95:95:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:53:24 np0005592767 nova_compute[182623]: 2026-01-22 22:53:24.399 182627 INFO nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Using config drive#033[00m
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.440 182627 INFO nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Creating config drive at /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.config#033[00m
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.452 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe9w9vu_x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.585 182627 DEBUG oslo_concurrency.processutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe9w9vu_x" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:53:25 np0005592767 kernel: tap3b7c3312-c2: entered promiscuous mode
Jan 22 17:53:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:25Z|00761|binding|INFO|Claiming lport 3b7c3312-c221-4ab3-bea1-750b26bea2a7 for this chassis.
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.6482] manager: (tap3b7c3312-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/356)
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.647 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:25Z|00762|binding|INFO|3b7c3312-c221-4ab3-bea1-750b26bea2a7: Claiming fa:16:3e:95:95:f3 10.100.0.7 2001:db8:0:1:f816:3eff:fe95:95f3 2001:db8::f816:3eff:fe95:95f3
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.660 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.663 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.6643] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.6652] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.672 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:95:f3 10.100.0.7 2001:db8:0:1:f816:3eff:fe95:95f3 2001:db8::f816:3eff:fe95:95f3'], port_security=['fa:16:3e:95:95:f3 10.100.0.7 2001:db8:0:1:f816:3eff:fe95:95f3 2001:db8::f816:3eff:fe95:95f3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8:0:1:f816:3eff:fe95:95f3/64 2001:db8::f816:3eff:fe95:95f3/64', 'neutron:device_id': '8de59b8f-1662-4616-8bf5-101b1cfaa332', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a18a174b-7c08-4298-a017-58661bd62402', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35e54559-2757-4541-b2f6-2cb439f23e24, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3b7c3312-c221-4ab3-bea1-750b26bea2a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.674 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3b7c3312-c221-4ab3-bea1-750b26bea2a7 in datapath bcbb187b-81b7-4e4f-9a13-417cae17c3c3 bound to our chassis#033[00m
Jan 22 17:53:25 np0005592767 systemd-udevd[239072]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.675 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bcbb187b-81b7-4e4f-9a13-417cae17c3c3#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.686 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[43e2a6ee-7986-470f-82ac-30f3c3a15ef9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.687 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbcbb187b-81 in ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.6893] device (tap3b7c3312-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.689 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbcbb187b-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.689 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b71a3302-fd34-4b33-a0c1-ebc663aad433]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.690 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[adde3fcc-8e69-4f9e-bb6f-d43b7d85fcd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.6908] device (tap3b7c3312-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:53:25 np0005592767 systemd-machined[153912]: New machine qemu-91-instance-000000ae.
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.705 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[21413af5-5d4a-458f-ab70-b2ae1b901191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.732 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6bc7167f-4c9d-4a2f-ab31-7df5f3cd0259]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.763 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd0ea5a-530f-44a8-a6ca-e9e3e6f428c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.768 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f122338c-e47d-4e4c-97cb-b8d91bcdbad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 systemd-udevd[239076]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.7711] manager: (tapbcbb187b-80): new Veth device (/org/freedesktop/NetworkManager/Devices/359)
Jan 22 17:53:25 np0005592767 systemd[1]: Started Virtual Machine qemu-91-instance-000000ae.
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.775 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.791 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:25Z|00763|binding|INFO|Setting lport 3b7c3312-c221-4ab3-bea1-750b26bea2a7 ovn-installed in OVS
Jan 22 17:53:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:25Z|00764|binding|INFO|Setting lport 3b7c3312-c221-4ab3-bea1-750b26bea2a7 up in Southbound
Jan 22 17:53:25 np0005592767 nova_compute[182623]: 2026-01-22 22:53:25.802 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.802 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0fa0ae-1a4e-4ccf-bbe9-5fa4cce4af49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.805 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b947d6-f681-4111-bb75-121b5643104e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 NetworkManager[54973]: <info>  [1769122405.8299] device (tapbcbb187b-80): carrier: link connected
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.836 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[66e4ad12-aa93-4d25-97d8-918e52566ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.855 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[84fe8c81-ae51-4007-9a0a-69b2f75475a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcbb187b-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:99:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600240, 'reachable_time': 44361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239105, 'error': None, 'target': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.874 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[127c7976-08dc-447f-96d6-d7a31d3c67be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:995a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600240, 'tstamp': 600240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239107, 'error': None, 'target': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.893 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[793edf58-db02-4d77-a50f-7c4557cf4338]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcbb187b-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:99:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600240, 'reachable_time': 44361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239108, 'error': None, 'target': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.925 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1f4250-4411-4b2f-992c-e88de5ba078a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:25 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:25.998 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0d49dd-1fa6-424d-a837-ab5f2ff486e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.000 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcbb187b-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.000 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.001 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcbb187b-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:26 np0005592767 kernel: tapbcbb187b-80: entered promiscuous mode
Jan 22 17:53:26 np0005592767 NetworkManager[54973]: <info>  [1769122406.0028] manager: (tapbcbb187b-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.002 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.006 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbcbb187b-80, col_values=(('external_ids', {'iface-id': '7d34b7c4-140b-4dda-ad27-aa5734d5709c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.007 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:26 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:26Z|00765|binding|INFO|Releasing lport 7d34b7c4-140b-4dda-ad27-aa5734d5709c from this chassis (sb_readonly=0)
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.008 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bcbb187b-81b7-4e4f-9a13-417cae17c3c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bcbb187b-81b7-4e4f-9a13-417cae17c3c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.009 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[82c4345c-a917-45bf-9bf8-177cea5f5e13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.009 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-bcbb187b-81b7-4e4f-9a13-417cae17c3c3
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/bcbb187b-81b7-4e4f-9a13-417cae17c3c3.pid.haproxy
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID bcbb187b-81b7-4e4f-9a13-417cae17c3c3
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.010 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'env', 'PROCESS_TAG=haproxy-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bcbb187b-81b7-4e4f-9a13-417cae17c3c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.019 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.296 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.296 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:53:26 np0005592767 podman[239140]: 2026-01-22 22:53:26.486675003 +0000 UTC m=+0.060672799 container create e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:53:26 np0005592767 systemd[1]: Started libpod-conmon-e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d.scope.
Jan 22 17:53:26 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:53:26 np0005592767 podman[239140]: 2026-01-22 22:53:26.449809919 +0000 UTC m=+0.023807725 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:53:26 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efb0ffa2afdb22e263595d21d985d08daee78863ee53087bd04b5ae063fad62f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:53:26 np0005592767 podman[239140]: 2026-01-22 22:53:26.564703052 +0000 UTC m=+0.138700828 container init e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:53:26 np0005592767 podman[239140]: 2026-01-22 22:53:26.57241052 +0000 UTC m=+0.146408296 container start e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 17:53:26 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [NOTICE]   (239167) : New worker (239169) forked
Jan 22 17:53:26 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [NOTICE]   (239167) : Loading success.
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.612 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122406.6112804, 8de59b8f-1662-4616-8bf5-101b1cfaa332 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.612 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] VM Started (Lifecycle Event)#033[00m
Jan 22 17:53:26 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:26.632 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.694 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.699 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122406.611588, 8de59b8f-1662-4616-8bf5-101b1cfaa332 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.699 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.758 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.761 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.778 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.807 182627 DEBUG nova.compute.manager [req-299a697f-cc5b-433f-808a-11822d4c6d5b req-6b10f3b3-194b-4d3a-b3d7-8aaabe755160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.808 182627 DEBUG oslo_concurrency.lockutils [req-299a697f-cc5b-433f-808a-11822d4c6d5b req-6b10f3b3-194b-4d3a-b3d7-8aaabe755160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.808 182627 DEBUG oslo_concurrency.lockutils [req-299a697f-cc5b-433f-808a-11822d4c6d5b req-6b10f3b3-194b-4d3a-b3d7-8aaabe755160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.809 182627 DEBUG oslo_concurrency.lockutils [req-299a697f-cc5b-433f-808a-11822d4c6d5b req-6b10f3b3-194b-4d3a-b3d7-8aaabe755160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.809 182627 DEBUG nova.compute.manager [req-299a697f-cc5b-433f-808a-11822d4c6d5b req-6b10f3b3-194b-4d3a-b3d7-8aaabe755160 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Processing event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.810 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.815 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122406.8153992, 8de59b8f-1662-4616-8bf5-101b1cfaa332 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.816 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.821 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.826 182627 INFO nova.virt.libvirt.driver [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Instance spawned successfully.#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.826 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.885 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.893 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.894 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.895 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.895 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.896 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.897 182627 DEBUG nova.virt.libvirt.driver [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.904 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:53:26 np0005592767 nova_compute[182623]: 2026-01-22 22:53:26.960 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:53:27 np0005592767 nova_compute[182623]: 2026-01-22 22:53:27.020 182627 INFO nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Took 12.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:53:27 np0005592767 nova_compute[182623]: 2026-01-22 22:53:27.020 182627 DEBUG nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:53:27 np0005592767 nova_compute[182623]: 2026-01-22 22:53:27.162 182627 INFO nova.compute.manager [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Took 14.71 seconds to build instance.#033[00m
Jan 22 17:53:27 np0005592767 nova_compute[182623]: 2026-01-22 22:53:27.215 182627 DEBUG oslo_concurrency.lockutils [None req-a5b30eba-0adf-479f-b210-aa7bb56d9fd3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:27 np0005592767 nova_compute[182623]: 2026-01-22 22:53:27.987 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:28 np0005592767 nova_compute[182623]: 2026-01-22 22:53:28.920 182627 DEBUG nova.compute.manager [req-bb6d35a9-c603-469d-b6de-d09128215def req-eb4d66e8-05ef-402e-bee8-247f1475c67d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:28 np0005592767 nova_compute[182623]: 2026-01-22 22:53:28.921 182627 DEBUG oslo_concurrency.lockutils [req-bb6d35a9-c603-469d-b6de-d09128215def req-eb4d66e8-05ef-402e-bee8-247f1475c67d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:28 np0005592767 nova_compute[182623]: 2026-01-22 22:53:28.921 182627 DEBUG oslo_concurrency.lockutils [req-bb6d35a9-c603-469d-b6de-d09128215def req-eb4d66e8-05ef-402e-bee8-247f1475c67d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:28 np0005592767 nova_compute[182623]: 2026-01-22 22:53:28.922 182627 DEBUG oslo_concurrency.lockutils [req-bb6d35a9-c603-469d-b6de-d09128215def req-eb4d66e8-05ef-402e-bee8-247f1475c67d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:28 np0005592767 nova_compute[182623]: 2026-01-22 22:53:28.922 182627 DEBUG nova.compute.manager [req-bb6d35a9-c603-469d-b6de-d09128215def req-eb4d66e8-05ef-402e-bee8-247f1475c67d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] No waiting events found dispatching network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:53:28 np0005592767 nova_compute[182623]: 2026-01-22 22:53:28.922 182627 WARNING nova.compute.manager [req-bb6d35a9-c603-469d-b6de-d09128215def req-eb4d66e8-05ef-402e-bee8-247f1475c67d 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received unexpected event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:53:29 np0005592767 nova_compute[182623]: 2026-01-22 22:53:29.319 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:29 np0005592767 nova_compute[182623]: 2026-01-22 22:53:29.418 182627 DEBUG nova.network.neutron [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updated VIF entry in instance network info cache for port 3b7c3312-c221-4ab3-bea1-750b26bea2a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:53:29 np0005592767 nova_compute[182623]: 2026-01-22 22:53:29.419 182627 DEBUG nova.network.neutron [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updating instance_info_cache with network_info: [{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:53:29 np0005592767 nova_compute[182623]: 2026-01-22 22:53:29.473 182627 DEBUG oslo_concurrency.lockutils [req-1f27eb29-826e-4dc0-ae97-eedf35801b4c req-429ebb46-b483-4643-91b6-96124169bd46 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:53:30 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:30.635 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:31 np0005592767 nova_compute[182623]: 2026-01-22 22:53:31.661 182627 DEBUG nova.compute.manager [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-changed-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:31 np0005592767 nova_compute[182623]: 2026-01-22 22:53:31.662 182627 DEBUG nova.compute.manager [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Refreshing instance network info cache due to event network-changed-3b7c3312-c221-4ab3-bea1-750b26bea2a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:53:31 np0005592767 nova_compute[182623]: 2026-01-22 22:53:31.662 182627 DEBUG oslo_concurrency.lockutils [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:53:31 np0005592767 nova_compute[182623]: 2026-01-22 22:53:31.663 182627 DEBUG oslo_concurrency.lockutils [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:53:31 np0005592767 nova_compute[182623]: 2026-01-22 22:53:31.663 182627 DEBUG nova.network.neutron [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Refreshing network info cache for port 3b7c3312-c221-4ab3-bea1-750b26bea2a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:53:32 np0005592767 nova_compute[182623]: 2026-01-22 22:53:32.989 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:33 np0005592767 podman[239178]: 2026-01-22 22:53:33.148047758 +0000 UTC m=+0.056616794 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 22 17:53:33 np0005592767 podman[239179]: 2026-01-22 22:53:33.180136926 +0000 UTC m=+0.078458572 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:53:34 np0005592767 nova_compute[182623]: 2026-01-22 22:53:34.322 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:34 np0005592767 nova_compute[182623]: 2026-01-22 22:53:34.560 182627 DEBUG nova.network.neutron [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updated VIF entry in instance network info cache for port 3b7c3312-c221-4ab3-bea1-750b26bea2a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:53:34 np0005592767 nova_compute[182623]: 2026-01-22 22:53:34.561 182627 DEBUG nova.network.neutron [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updating instance_info_cache with network_info: [{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:53:34 np0005592767 nova_compute[182623]: 2026-01-22 22:53:34.620 182627 DEBUG oslo_concurrency.lockutils [req-76c0f567-a3a6-41ad-bd6f-77713d241ef8 req-dce387e6-adb5-439d-a157-41988108fa21 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:53:34 np0005592767 nova_compute[182623]: 2026-01-22 22:53:34.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:35 np0005592767 nova_compute[182623]: 2026-01-22 22:53:35.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:35 np0005592767 nova_compute[182623]: 2026-01-22 22:53:35.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:53:35 np0005592767 nova_compute[182623]: 2026-01-22 22:53:35.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:53:36 np0005592767 nova_compute[182623]: 2026-01-22 22:53:36.133 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:53:36 np0005592767 nova_compute[182623]: 2026-01-22 22:53:36.133 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:53:36 np0005592767 nova_compute[182623]: 2026-01-22 22:53:36.134 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:53:36 np0005592767 nova_compute[182623]: 2026-01-22 22:53:36.134 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8de59b8f-1662-4616-8bf5-101b1cfaa332 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:53:37 np0005592767 nova_compute[182623]: 2026-01-22 22:53:37.992 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:38 np0005592767 podman[239236]: 2026-01-22 22:53:38.139152474 +0000 UTC m=+0.045475288 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:53:39 np0005592767 nova_compute[182623]: 2026-01-22 22:53:39.336 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:39Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:95:f3 10.100.0.7
Jan 22 17:53:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:39Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:95:f3 10.100.0.7
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.179 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updating instance_info_cache with network_info: [{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.207 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.207 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.207 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.208 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.208 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.208 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.240 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.241 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.241 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.241 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.312 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.384 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.385 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.445 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.619 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.620 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5541MB free_disk=73.0236587524414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.621 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.621 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.702 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 8de59b8f-1662-4616-8bf5-101b1cfaa332 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.703 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.703 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.787 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.816 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.841 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.842 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:42 np0005592767 nova_compute[182623]: 2026-01-22 22:53:42.996 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:44 np0005592767 nova_compute[182623]: 2026-01-22 22:53:44.339 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:45 np0005592767 nova_compute[182623]: 2026-01-22 22:53:45.538 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:45 np0005592767 nova_compute[182623]: 2026-01-22 22:53:45.538 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:45 np0005592767 nova_compute[182623]: 2026-01-22 22:53:45.538 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:53:48 np0005592767 nova_compute[182623]: 2026-01-22 22:53:48.002 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:48 np0005592767 nova_compute[182623]: 2026-01-22 22:53:48.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.341 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.806 182627 DEBUG nova.compute.manager [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-changed-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.807 182627 DEBUG nova.compute.manager [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Refreshing instance network info cache due to event network-changed-3b7c3312-c221-4ab3-bea1-750b26bea2a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.807 182627 DEBUG oslo_concurrency.lockutils [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.808 182627 DEBUG oslo_concurrency.lockutils [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.808 182627 DEBUG nova.network.neutron [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Refreshing network info cache for port 3b7c3312-c221-4ab3-bea1-750b26bea2a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.880 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.881 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.881 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.882 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.882 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.898 182627 INFO nova.compute.manager [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Terminating instance#033[00m
Jan 22 17:53:49 np0005592767 nova_compute[182623]: 2026-01-22 22:53:49.909 182627 DEBUG nova.compute.manager [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:53:49 np0005592767 kernel: tap3b7c3312-c2 (unregistering): left promiscuous mode
Jan 22 17:53:49 np0005592767 NetworkManager[54973]: <info>  [1769122429.9435] device (tap3b7c3312-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:50Z|00766|binding|INFO|Releasing lport 3b7c3312-c221-4ab3-bea1-750b26bea2a7 from this chassis (sb_readonly=0)
Jan 22 17:53:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:50Z|00767|binding|INFO|Setting lport 3b7c3312-c221-4ab3-bea1-750b26bea2a7 down in Southbound
Jan 22 17:53:50 np0005592767 ovn_controller[94769]: 2026-01-22T22:53:50Z|00768|binding|INFO|Removing iface tap3b7c3312-c2 ovn-installed in OVS
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.017 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.033 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:95:f3 10.100.0.7 2001:db8:0:1:f816:3eff:fe95:95f3 2001:db8::f816:3eff:fe95:95f3'], port_security=['fa:16:3e:95:95:f3 10.100.0.7 2001:db8:0:1:f816:3eff:fe95:95f3 2001:db8::f816:3eff:fe95:95f3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8:0:1:f816:3eff:fe95:95f3/64 2001:db8::f816:3eff:fe95:95f3/64', 'neutron:device_id': '8de59b8f-1662-4616-8bf5-101b1cfaa332', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a18a174b-7c08-4298-a017-58661bd62402', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35e54559-2757-4541-b2f6-2cb439f23e24, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=3b7c3312-c221-4ab3-bea1-750b26bea2a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.036 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 3b7c3312-c221-4ab3-bea1-750b26bea2a7 in datapath bcbb187b-81b7-4e4f-9a13-417cae17c3c3 unbound from our chassis#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.038 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.040 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[88d887f4-9304-490e-bca1-3f8eb49a4cbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.041 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3 namespace which is not needed anymore#033[00m
Jan 22 17:53:50 np0005592767 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Jan 22 17:53:50 np0005592767 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000ae.scope: Consumed 13.191s CPU time.
Jan 22 17:53:50 np0005592767 systemd-machined[153912]: Machine qemu-91-instance-000000ae terminated.
Jan 22 17:53:50 np0005592767 podman[239271]: 2026-01-22 22:53:50.08320346 +0000 UTC m=+0.099589250 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.133 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.173 182627 INFO nova.virt.libvirt.driver [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Instance destroyed successfully.#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.174 182627 DEBUG nova.objects.instance [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid 8de59b8f-1662-4616-8bf5-101b1cfaa332 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.189 182627 DEBUG nova.virt.libvirt.vif [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:53:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1145811343',display_name='tempest-TestGettingAddress-server-1145811343',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1145811343',id=174,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqXnqrMY5CwIl+chhiY+V0KiwfjE1aQfBJnCBBn6VRqwsdAUJbkY/c6j2vgsYGCBMAwbME+hHoqS/0xDjD33F+ugZSs4aTJnliui0z8I9qIqN8oagqLB9AWr4k85MMj2w==',key_name='tempest-TestGettingAddress-1051750305',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:53:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-xd6jcfm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:53:27Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=8de59b8f-1662-4616-8bf5-101b1cfaa332,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.189 182627 DEBUG nova.network.os_vif_util [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.191 182627 DEBUG nova.network.os_vif_util [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:53:50 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [NOTICE]   (239167) : haproxy version is 2.8.14-c23fe91
Jan 22 17:53:50 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [NOTICE]   (239167) : path to executable is /usr/sbin/haproxy
Jan 22 17:53:50 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [WARNING]  (239167) : Exiting Master process...
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.192 182627 DEBUG os_vif [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:53:50 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [ALERT]    (239167) : Current worker (239169) exited with code 143 (Terminated)
Jan 22 17:53:50 np0005592767 neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3[239159]: [WARNING]  (239167) : All workers exited. Exiting... (0)
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.195 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 systemd[1]: libpod-e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d.scope: Deactivated successfully.
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.196 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b7c3312-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.197 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.199 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.201 182627 INFO os_vif [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:95:f3,bridge_name='br-int',has_traffic_filtering=True,id=3b7c3312-c221-4ab3-bea1-750b26bea2a7,network=Network(bcbb187b-81b7-4e4f-9a13-417cae17c3c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b7c3312-c2')#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.202 182627 INFO nova.virt.libvirt.driver [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Deleting instance files /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332_del#033[00m
Jan 22 17:53:50 np0005592767 podman[239317]: 2026-01-22 22:53:50.202260861 +0000 UTC m=+0.049841302 container died e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.203 182627 INFO nova.virt.libvirt.driver [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Deletion of /var/lib/nova/instances/8de59b8f-1662-4616-8bf5-101b1cfaa332_del complete#033[00m
Jan 22 17:53:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d-userdata-shm.mount: Deactivated successfully.
Jan 22 17:53:50 np0005592767 systemd[1]: var-lib-containers-storage-overlay-efb0ffa2afdb22e263595d21d985d08daee78863ee53087bd04b5ae063fad62f-merged.mount: Deactivated successfully.
Jan 22 17:53:50 np0005592767 podman[239317]: 2026-01-22 22:53:50.245762253 +0000 UTC m=+0.093342694 container cleanup e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:53:50 np0005592767 systemd[1]: libpod-conmon-e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d.scope: Deactivated successfully.
Jan 22 17:53:50 np0005592767 podman[239356]: 2026-01-22 22:53:50.304833835 +0000 UTC m=+0.041126215 container remove e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.309 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[24875599-6b79-4796-947b-bb1811ea22d3]: (4, ('Thu Jan 22 10:53:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3 (e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d)\ne5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d\nThu Jan 22 10:53:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3 (e5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d)\ne5cb0d0749dfeaaa18855d76e840a0c22304ae58b1c8de2ff47599230d3d472d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.312 182627 INFO nova.compute.manager [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.312 182627 DEBUG oslo.service.loopingcall [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.312 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b591d7-63c7-47df-99b9-5686b0e8fcd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.313 182627 DEBUG nova.compute.manager [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.313 182627 DEBUG nova.network.neutron [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.314 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcbb187b-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:53:50 np0005592767 kernel: tapbcbb187b-80: left promiscuous mode
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.319 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 nova_compute[182623]: 2026-01-22 22:53:50.329 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.331 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5d8a6d-509c-4504-844c-5c50479d5ba5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.347 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[32904060-d7a5-4ac9-82ef-ae2ca2d612bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.348 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eab9e497-3a01-4db2-9f6d-0800ce270806]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.362 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[50a054a2-ac57-457e-a481-1d38996763b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600233, 'reachable_time': 21533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239371, 'error': None, 'target': 'ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:50 np0005592767 systemd[1]: run-netns-ovnmeta\x2dbcbb187b\x2d81b7\x2d4e4f\x2d9a13\x2d417cae17c3c3.mount: Deactivated successfully.
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.367 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bcbb187b-81b7-4e4f-9a13-417cae17c3c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:53:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:53:50.367 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b5c5e4-7011-41d4-9de7-aae735b6f116]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.258 182627 DEBUG nova.network.neutron [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.278 182627 INFO nova.compute.manager [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Took 0.97 seconds to deallocate network for instance.#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.375 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.375 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.448 182627 DEBUG nova.compute.provider_tree [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.469 182627 DEBUG nova.scheduler.client.report [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.491 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.527 182627 INFO nova.scheduler.client.report [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance 8de59b8f-1662-4616-8bf5-101b1cfaa332#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.639 182627 DEBUG oslo_concurrency.lockutils [None req-461bc6e6-2f43-4460-88b9-ebb352ee8530 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.966 182627 DEBUG nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-vif-unplugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.966 182627 DEBUG oslo_concurrency.lockutils [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.967 182627 DEBUG oslo_concurrency.lockutils [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.967 182627 DEBUG oslo_concurrency.lockutils [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.967 182627 DEBUG nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] No waiting events found dispatching network-vif-unplugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.968 182627 WARNING nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received unexpected event network-vif-unplugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.968 182627 DEBUG nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.968 182627 DEBUG oslo_concurrency.lockutils [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.969 182627 DEBUG oslo_concurrency.lockutils [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.969 182627 DEBUG oslo_concurrency.lockutils [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "8de59b8f-1662-4616-8bf5-101b1cfaa332-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.969 182627 DEBUG nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] No waiting events found dispatching network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.969 182627 WARNING nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received unexpected event network-vif-plugged-3b7c3312-c221-4ab3-bea1-750b26bea2a7 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:53:51 np0005592767 nova_compute[182623]: 2026-01-22 22:53:51.970 182627 DEBUG nova.compute.manager [req-683e60d6-d22c-4753-8136-f1f370e629aa req-99385569-22c6-459c-afa6-2ca1a53cc5db 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Received event network-vif-deleted-3b7c3312-c221-4ab3-bea1-750b26bea2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:53:52 np0005592767 nova_compute[182623]: 2026-01-22 22:53:52.557 182627 DEBUG nova.network.neutron [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updated VIF entry in instance network info cache for port 3b7c3312-c221-4ab3-bea1-750b26bea2a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:53:52 np0005592767 nova_compute[182623]: 2026-01-22 22:53:52.558 182627 DEBUG nova.network.neutron [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Updating instance_info_cache with network_info: [{"id": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "address": "fa:16:3e:95:95:f3", "network": {"id": "bcbb187b-81b7-4e4f-9a13-417cae17c3c3", "bridge": "br-int", "label": "tempest-network-smoke--416755164", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:95f3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b7c3312-c2", "ovs_interfaceid": "3b7c3312-c221-4ab3-bea1-750b26bea2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:53:52 np0005592767 nova_compute[182623]: 2026-01-22 22:53:52.585 182627 DEBUG oslo_concurrency.lockutils [req-14fb46c6-f7b2-4ad7-8353-a4a332e8c971 req-7ca6d620-2967-4d3e-a098-ee9a55fbfdf5 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-8de59b8f-1662-4616-8bf5-101b1cfaa332" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:53:53 np0005592767 nova_compute[182623]: 2026-01-22 22:53:53.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:55 np0005592767 podman[239373]: 2026-01-22 22:53:55.140053298 +0000 UTC m=+0.053306140 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 17:53:55 np0005592767 podman[239372]: 2026-01-22 22:53:55.158263613 +0000 UTC m=+0.073143662 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:53:55 np0005592767 nova_compute[182623]: 2026-01-22 22:53:55.199 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:53:58 np0005592767 nova_compute[182623]: 2026-01-22 22:53:58.048 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:00 np0005592767 nova_compute[182623]: 2026-01-22 22:54:00.255 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:03 np0005592767 nova_compute[182623]: 2026-01-22 22:54:03.049 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:04 np0005592767 podman[239421]: 2026-01-22 22:54:04.156237101 +0000 UTC m=+0.066148394 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:54:04 np0005592767 podman[239420]: 2026-01-22 22:54:04.163033824 +0000 UTC m=+0.084206525 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:54:05 np0005592767 nova_compute[182623]: 2026-01-22 22:54:05.173 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122430.1709166, 8de59b8f-1662-4616-8bf5-101b1cfaa332 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:54:05 np0005592767 nova_compute[182623]: 2026-01-22 22:54:05.173 182627 INFO nova.compute.manager [-] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:54:05 np0005592767 nova_compute[182623]: 2026-01-22 22:54:05.198 182627 DEBUG nova.compute.manager [None req-a1a07e1d-ab13-4a56-89f3-8c5b95999a8f - - - - - -] [instance: 8de59b8f-1662-4616-8bf5-101b1cfaa332] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:54:05 np0005592767 nova_compute[182623]: 2026-01-22 22:54:05.259 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:06 np0005592767 nova_compute[182623]: 2026-01-22 22:54:06.238 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:06 np0005592767 nova_compute[182623]: 2026-01-22 22:54:06.359 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:54:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:54:08 np0005592767 nova_compute[182623]: 2026-01-22 22:54:08.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:09 np0005592767 podman[239467]: 2026-01-22 22:54:09.158926325 +0000 UTC m=+0.079464420 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:54:10 np0005592767 nova_compute[182623]: 2026-01-22 22:54:10.264 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:12.125 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:12.125 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:12.126 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:13 np0005592767 nova_compute[182623]: 2026-01-22 22:54:13.075 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:15 np0005592767 nova_compute[182623]: 2026-01-22 22:54:15.268 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:18 np0005592767 nova_compute[182623]: 2026-01-22 22:54:18.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:20 np0005592767 nova_compute[182623]: 2026-01-22 22:54:20.272 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:21 np0005592767 podman[239491]: 2026-01-22 22:54:21.172124139 +0000 UTC m=+0.079632585 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:54:23 np0005592767 nova_compute[182623]: 2026-01-22 22:54:23.113 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:25 np0005592767 nova_compute[182623]: 2026-01-22 22:54:25.277 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:26 np0005592767 podman[239513]: 2026-01-22 22:54:26.133146105 +0000 UTC m=+0.053107005 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Jan 22 17:54:26 np0005592767 podman[239512]: 2026-01-22 22:54:26.208759054 +0000 UTC m=+0.121355355 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:54:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:27.642 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2 2001:db8::f816:3eff:fecb:140a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fecb:140a/64', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3589f7d-c75d-4b83-bcd4-26c17c46dcf1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0ee482bf-08e3-4c08-b19d-3799def66c4e) old=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:54:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:27.643 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0ee482bf-08e3-4c08-b19d-3799def66c4e in datapath 0dcb11b3-4f88-477e-8e29-469839246ce6 updated#033[00m
Jan 22 17:54:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:27.644 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0dcb11b3-4f88-477e-8e29-469839246ce6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:54:27 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:27.646 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9f5689-7533-4dd9-93fd-e2f97a4cb741]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:54:28 np0005592767 nova_compute[182623]: 2026-01-22 22:54:28.115 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:30 np0005592767 nova_compute[182623]: 2026-01-22 22:54:30.280 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:30 np0005592767 nova_compute[182623]: 2026-01-22 22:54:30.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:31.262 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:54:31 np0005592767 nova_compute[182623]: 2026-01-22 22:54:31.263 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:31 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:31.264 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:54:33 np0005592767 nova_compute[182623]: 2026-01-22 22:54:33.117 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:35 np0005592767 podman[239560]: 2026-01-22 22:54:35.180038367 +0000 UTC m=+0.084709080 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Jan 22 17:54:35 np0005592767 podman[239561]: 2026-01-22 22:54:35.188481956 +0000 UTC m=+0.082814586 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:54:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:35.188 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2 2001:db8:0:1:f816:3eff:fecb:140a 2001:db8::f816:3eff:fecb:140a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fecb:140a/64 2001:db8::f816:3eff:fecb:140a/64', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3589f7d-c75d-4b83-bcd4-26c17c46dcf1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0ee482bf-08e3-4c08-b19d-3799def66c4e) old=Port_Binding(mac=['fa:16:3e:cb:14:0a 10.100.0.2 2001:db8::f816:3eff:fecb:140a'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fecb:140a/64', 'neutron:device_id': 'ovnmeta-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0dcb11b3-4f88-477e-8e29-469839246ce6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:54:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:35.189 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0ee482bf-08e3-4c08-b19d-3799def66c4e in datapath 0dcb11b3-4f88-477e-8e29-469839246ce6 updated#033[00m
Jan 22 17:54:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:35.191 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0dcb11b3-4f88-477e-8e29-469839246ce6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:54:35 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:35.191 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[21b7c17f-78a9-4b95-b157-8d19f2c39ce0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:54:35 np0005592767 nova_compute[182623]: 2026-01-22 22:54:35.283 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:35 np0005592767 nova_compute[182623]: 2026-01-22 22:54:35.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:36 np0005592767 nova_compute[182623]: 2026-01-22 22:54:36.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:36 np0005592767 nova_compute[182623]: 2026-01-22 22:54:36.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:54:36 np0005592767 nova_compute[182623]: 2026-01-22 22:54:36.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:54:36 np0005592767 nova_compute[182623]: 2026-01-22 22:54:36.919 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:54:37 np0005592767 nova_compute[182623]: 2026-01-22 22:54:37.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:38 np0005592767 nova_compute[182623]: 2026-01-22 22:54:38.161 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:54:39.267 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:54:40 np0005592767 podman[239603]: 2026-01-22 22:54:40.161969754 +0000 UTC m=+0.076826386 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.329 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.932 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.932 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.933 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:40 np0005592767 nova_compute[182623]: 2026-01-22 22:54:40.933 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.091 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.093 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.05239868164062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.093 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.093 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.223 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.224 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.254 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.276 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.305 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:54:41 np0005592767 nova_compute[182623]: 2026-01-22 22:54:41.306 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:54:42Z|00769|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 22 17:54:43 np0005592767 nova_compute[182623]: 2026-01-22 22:54:43.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:43 np0005592767 nova_compute[182623]: 2026-01-22 22:54:43.306 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:45 np0005592767 nova_compute[182623]: 2026-01-22 22:54:45.331 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:45 np0005592767 nova_compute[182623]: 2026-01-22 22:54:45.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:46 np0005592767 nova_compute[182623]: 2026-01-22 22:54:46.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:46 np0005592767 nova_compute[182623]: 2026-01-22 22:54:46.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:54:46 np0005592767 nova_compute[182623]: 2026-01-22 22:54:46.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:46 np0005592767 nova_compute[182623]: 2026-01-22 22:54:46.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:54:46 np0005592767 nova_compute[182623]: 2026-01-22 22:54:46.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 17:54:48 np0005592767 nova_compute[182623]: 2026-01-22 22:54:48.168 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:50 np0005592767 nova_compute[182623]: 2026-01-22 22:54:50.336 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:51 np0005592767 nova_compute[182623]: 2026-01-22 22:54:51.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:51 np0005592767 nova_compute[182623]: 2026-01-22 22:54:51.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:54:52 np0005592767 podman[239628]: 2026-01-22 22:54:52.129887517 +0000 UTC m=+0.052883409 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 17:54:53 np0005592767 nova_compute[182623]: 2026-01-22 22:54:53.169 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:53 np0005592767 nova_compute[182623]: 2026-01-22 22:54:53.923 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:54:55 np0005592767 nova_compute[182623]: 2026-01-22 22:54:55.397 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:56 np0005592767 nova_compute[182623]: 2026-01-22 22:54:56.930 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:56 np0005592767 nova_compute[182623]: 2026-01-22 22:54:56.931 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:56 np0005592767 nova_compute[182623]: 2026-01-22 22:54:56.961 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:54:57 np0005592767 podman[239651]: 2026-01-22 22:54:57.137314016 +0000 UTC m=+0.055642107 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.158 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.159 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.165 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.165 182627 INFO nova.compute.claims [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:54:57 np0005592767 podman[239650]: 2026-01-22 22:54:57.184013248 +0000 UTC m=+0.098117849 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.337 182627 DEBUG nova.compute.provider_tree [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.354 182627 DEBUG nova.scheduler.client.report [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.389 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.390 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.479 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.479 182627 DEBUG nova.network.neutron [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.501 182627 INFO nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.524 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.670 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.671 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.672 182627 INFO nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Creating image(s)#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.673 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.673 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.674 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.695 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.758 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.759 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.760 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.780 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.866 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.867 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.907 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.908 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.908 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.963 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.966 182627 DEBUG nova.virt.disk.api [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.967 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:54:57 np0005592767 nova_compute[182623]: 2026-01-22 22:54:57.992 182627 DEBUG nova.policy [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.050 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.051 182627 DEBUG nova.virt.disk.api [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.051 182627 DEBUG nova.objects.instance [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid c468b88d-d414-4f0b-af1c-9b13676f4f04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.067 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.068 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Ensure instance console log exists: /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.068 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.068 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.069 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:54:58 np0005592767 nova_compute[182623]: 2026-01-22 22:54:58.172 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:54:59 np0005592767 nova_compute[182623]: 2026-01-22 22:54:59.680 182627 DEBUG nova.network.neutron [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Successfully created port: f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:55:00 np0005592767 nova_compute[182623]: 2026-01-22 22:55:00.402 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.269 182627 DEBUG nova.network.neutron [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Successfully updated port: f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.294 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.294 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.295 182627 DEBUG nova.network.neutron [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.412 182627 DEBUG nova.compute.manager [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-changed-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.413 182627 DEBUG nova.compute.manager [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Refreshing instance network info cache due to event network-changed-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:55:01 np0005592767 nova_compute[182623]: 2026-01-22 22:55:01.414 182627 DEBUG oslo_concurrency.lockutils [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:55:02 np0005592767 nova_compute[182623]: 2026-01-22 22:55:02.179 182627 DEBUG nova.network.neutron [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:55:03 np0005592767 nova_compute[182623]: 2026-01-22 22:55:03.172 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.173 182627 DEBUG nova.network.neutron [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updating instance_info_cache with network_info: [{"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.193 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.193 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Instance network_info: |[{"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.194 182627 DEBUG oslo_concurrency.lockutils [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.194 182627 DEBUG nova.network.neutron [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Refreshing network info cache for port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.198 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Start _get_guest_xml network_info=[{"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.203 182627 WARNING nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.208 182627 DEBUG nova.virt.libvirt.host [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.208 182627 DEBUG nova.virt.libvirt.host [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.213 182627 DEBUG nova.virt.libvirt.host [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.214 182627 DEBUG nova.virt.libvirt.host [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.215 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.215 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.215 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.216 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.216 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.216 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.217 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.217 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.217 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.217 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.218 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.218 182627 DEBUG nova.virt.hardware [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.222 182627 DEBUG nova.virt.libvirt.vif [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-gen',id=178,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF3bM7NQGtG5yriXbrRucGgbXDO+Ih+16YwqKkCqi6iYV70uP3NcZajayx6+zddPMbVIGqZDPRroiJyEP2VGI5ncm7A4UPQ1aQzuh23PRRclUINeQtZ2TOXHx39xQlJ4JA==',key_name='tempest-TestSecurityGroupsBasicOps-227428177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-qm2gyyqr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:54:57Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=c468b88d-d414-4f0b-af1c-9b13676f4f04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.222 182627 DEBUG nova.network.os_vif_util [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.223 182627 DEBUG nova.network.os_vif_util [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.224 182627 DEBUG nova.objects.instance [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid c468b88d-d414-4f0b-af1c-9b13676f4f04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.241 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <uuid>c468b88d-d414-4f0b-af1c-9b13676f4f04</uuid>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <name>instance-000000b2</name>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474</nova:name>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:55:04</nova:creationTime>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        <nova:port uuid="f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <entry name="serial">c468b88d-d414-4f0b-af1c-9b13676f4f04</entry>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <entry name="uuid">c468b88d-d414-4f0b-af1c-9b13676f4f04</entry>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.config"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:43:28:38"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <target dev="tapf1f8dea9-5a"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/console.log" append="off"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:55:04 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:55:04 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:55:04 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:55:04 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.242 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Preparing to wait for external event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.242 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:04 np0005592767 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.242 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.243 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.243 182627 DEBUG nova.virt.libvirt.vif [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-gen',id=178,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF3bM7NQGtG5yriXbrRucGgbXDO+Ih+16YwqKkCqi6iYV70uP3NcZajayx6+zddPMbVIGqZDPRroiJyEP2VGI5ncm7A4UPQ1aQzuh23PRRclUINeQtZ2TOXHx39xQlJ4JA==',key_name='tempest-TestSecurityGroupsBasicOps-227428177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-qm2gyyqr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:54:57Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=c468b88d-d414-4f0b-af1c-9b13676f4f04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.244 182627 DEBUG nova.network.os_vif_util [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.244 182627 DEBUG nova.network.os_vif_util [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.244 182627 DEBUG os_vif [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.245 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.245 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.245 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.247 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.248 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1f8dea9-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.248 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf1f8dea9-5a, col_values=(('external_ids', {'iface-id': 'f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:28:38', 'vm-uuid': 'c468b88d-d414-4f0b-af1c-9b13676f4f04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:04 np0005592767 NetworkManager[54973]: <info>  [1769122504.2507] manager: (tapf1f8dea9-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.249 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.251 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.257 182627 INFO os_vif [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a')#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.310 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.311 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.311 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:43:28:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:55:04 np0005592767 nova_compute[182623]: 2026-01-22 22:55:04.312 182627 INFO nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Using config drive#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.173 182627 INFO nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Creating config drive at /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.config#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.184 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0e_eaz1y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.318 182627 DEBUG oslo_concurrency.processutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0e_eaz1y" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:05 np0005592767 kernel: tapf1f8dea9-5a: entered promiscuous mode
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.4493] manager: (tapf1f8dea9-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/362)
Jan 22 17:55:05 np0005592767 systemd-udevd[239744]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.449 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.454 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:05Z|00770|binding|INFO|Claiming lport f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd for this chassis.
Jan 22 17:55:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:05Z|00771|binding|INFO|f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd: Claiming fa:16:3e:43:28:38 10.100.0.8
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.4700] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.4706] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.469 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.4713] device (tapf1f8dea9-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.4718] device (tapf1f8dea9-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.475 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:28:38 10.100.0.8'], port_security=['fa:16:3e:43:28:38 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'c468b88d-d414-4f0b-af1c-9b13676f4f04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77169cf9-4ddb-4a48-a907-fa4abc0d69fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd7cbcb5-c231-44bb-be1c-c0898fbee74d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.476 104135 INFO neutron.agent.ovn.metadata.agent [-] Port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd in datapath 930b9b12-ffcc-452a-86e1-0321bc77aa71 bound to our chassis#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.477 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 930b9b12-ffcc-452a-86e1-0321bc77aa71#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.488 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d282fad7-e899-4e28-b186-80ebf7755a01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.489 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap930b9b12-f1 in ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.491 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap930b9b12-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.491 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3959f0-db38-4f2e-a1e5-208400c70488]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.492 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[90d044bf-f0b4-4b4e-ad1a-85663f0fc0da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.504 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[fe74a62b-0c3c-4f03-bcb6-e55e7dd4d8db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 podman[239723]: 2026-01-22 22:55:05.508011365 +0000 UTC m=+0.110783797 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 17:55:05 np0005592767 podman[239724]: 2026-01-22 22:55:05.50994418 +0000 UTC m=+0.105795237 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:55:05 np0005592767 systemd-machined[153912]: New machine qemu-92-instance-000000b2.
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.534 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6783c5-cba2-4df1-837b-bb60c2af94ae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.565 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[44b8423b-6d00-42eb-914f-e5f42e6b01db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 systemd[1]: Started Virtual Machine qemu-92-instance-000000b2.
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.568 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 systemd-udevd[239750]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.5733] manager: (tap930b9b12-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/365)
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.572 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5c00f8a0-1df2-4e62-8c96-5cc642603eba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.583 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:05Z|00772|binding|INFO|Setting lport f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd ovn-installed in OVS
Jan 22 17:55:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:05Z|00773|binding|INFO|Setting lport f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd up in Southbound
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.594 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.608 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[812757e7-ea98-4d12-bf57-8c89373dbc57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.611 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb064ef-651e-4bbe-811b-71ad275a4311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.6320] device (tap930b9b12-f0): carrier: link connected
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.636 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[da01e2c7-7660-4ead-97a6-a5c64d69c959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.651 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[033f12a9-427a-4c87-b333-2385945e3203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap930b9b12-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:93:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610221, 'reachable_time': 16192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239801, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.664 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ac9aaa-fd4e-4ee2-8b19-a2f0a0931c2f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:9313'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610221, 'tstamp': 610221}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239803, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.677 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ade051-3158-4916-91bd-41bed95068df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap930b9b12-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:93:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610221, 'reachable_time': 16192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239804, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.702 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9bad2a-d227-4052-837b-62d571d0481b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.759 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9408681c-e59e-46d0-adf4-8cde0143ac3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.760 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap930b9b12-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.760 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.761 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap930b9b12-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.762 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 kernel: tap930b9b12-f0: entered promiscuous mode
Jan 22 17:55:05 np0005592767 NetworkManager[54973]: <info>  [1769122505.7644] manager: (tap930b9b12-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.764 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.764 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap930b9b12-f0, col_values=(('external_ids', {'iface-id': '5422293b-7f51-474a-850b-2710ef12aac0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.765 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:05Z|00774|binding|INFO|Releasing lport 5422293b-7f51-474a-850b-2710ef12aac0 from this chassis (sb_readonly=0)
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.777 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.778 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/930b9b12-ffcc-452a-86e1-0321bc77aa71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/930b9b12-ffcc-452a-86e1-0321bc77aa71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.779 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb6794e-dcaa-4b97-ac87-fad10cc66ab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.780 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-930b9b12-ffcc-452a-86e1-0321bc77aa71
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/930b9b12-ffcc-452a-86e1-0321bc77aa71.pid.haproxy
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 930b9b12-ffcc-452a-86e1-0321bc77aa71
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:55:05 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:05.780 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'env', 'PROCESS_TAG=haproxy-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/930b9b12-ffcc-452a-86e1-0321bc77aa71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.896 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122505.8960729, c468b88d-d414-4f0b-af1c-9b13676f4f04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.897 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] VM Started (Lifecycle Event)#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.914 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.918 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122505.896841, c468b88d-d414-4f0b-af1c-9b13676f4f04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.919 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.933 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.936 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:55:05 np0005592767 nova_compute[182623]: 2026-01-22 22:55:05.953 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:55:06 np0005592767 podman[239843]: 2026-01-22 22:55:06.130036045 +0000 UTC m=+0.050805460 container create e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:55:06 np0005592767 systemd[1]: Started libpod-conmon-e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09.scope.
Jan 22 17:55:06 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:55:06 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7c014303a8e3a6df966c87a60d4d75e11c84a1268b70333f1622126408aa756/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:55:06 np0005592767 podman[239843]: 2026-01-22 22:55:06.100618102 +0000 UTC m=+0.021387537 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:55:06 np0005592767 podman[239843]: 2026-01-22 22:55:06.206610303 +0000 UTC m=+0.127379748 container init e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 17:55:06 np0005592767 podman[239843]: 2026-01-22 22:55:06.211851361 +0000 UTC m=+0.132620776 container start e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:55:06 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [NOTICE]   (239863) : New worker (239865) forked
Jan 22 17:55:06 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [NOTICE]   (239863) : Loading success.
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.334 182627 DEBUG nova.compute.manager [req-4cc26598-0ca9-4eb3-8c0c-07374b992d98 req-a539b5d0-50a6-46bd-b3ff-341cfd2dd09c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.335 182627 DEBUG oslo_concurrency.lockutils [req-4cc26598-0ca9-4eb3-8c0c-07374b992d98 req-a539b5d0-50a6-46bd-b3ff-341cfd2dd09c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.335 182627 DEBUG oslo_concurrency.lockutils [req-4cc26598-0ca9-4eb3-8c0c-07374b992d98 req-a539b5d0-50a6-46bd-b3ff-341cfd2dd09c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.335 182627 DEBUG oslo_concurrency.lockutils [req-4cc26598-0ca9-4eb3-8c0c-07374b992d98 req-a539b5d0-50a6-46bd-b3ff-341cfd2dd09c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.336 182627 DEBUG nova.compute.manager [req-4cc26598-0ca9-4eb3-8c0c-07374b992d98 req-a539b5d0-50a6-46bd-b3ff-341cfd2dd09c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Processing event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.336 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.340 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122506.340315, c468b88d-d414-4f0b-af1c-9b13676f4f04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.340 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.342 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.345 182627 INFO nova.virt.libvirt.driver [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Instance spawned successfully.#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.345 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.363 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.367 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.389 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.390 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.390 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.391 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.392 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.392 182627 DEBUG nova.virt.libvirt.driver [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.398 182627 DEBUG nova.network.neutron [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updated VIF entry in instance network info cache for port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.399 182627 DEBUG nova.network.neutron [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updating instance_info_cache with network_info: [{"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.400 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.434 182627 DEBUG oslo_concurrency.lockutils [req-9c04200d-77de-4ea2-8bd3-3a5e3472a4b1 req-09c1bc07-0513-46e3-92fe-966c47e2af93 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.478 182627 INFO nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Took 8.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.478 182627 DEBUG nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.542 182627 INFO nova.compute.manager [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Took 9.51 seconds to build instance.#033[00m
Jan 22 17:55:06 np0005592767 nova_compute[182623]: 2026-01-22 22:55:06.558 182627 DEBUG oslo_concurrency.lockutils [None req-0e33937a-1e51-438a-b805-dd3b30b791d4 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.175 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.432 182627 DEBUG nova.compute.manager [req-ce50bef4-c50c-49fe-a403-fec9d1faa858 req-5c9d6abc-8791-49dc-a51b-6b9d1df97d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.433 182627 DEBUG oslo_concurrency.lockutils [req-ce50bef4-c50c-49fe-a403-fec9d1faa858 req-5c9d6abc-8791-49dc-a51b-6b9d1df97d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.433 182627 DEBUG oslo_concurrency.lockutils [req-ce50bef4-c50c-49fe-a403-fec9d1faa858 req-5c9d6abc-8791-49dc-a51b-6b9d1df97d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.433 182627 DEBUG oslo_concurrency.lockutils [req-ce50bef4-c50c-49fe-a403-fec9d1faa858 req-5c9d6abc-8791-49dc-a51b-6b9d1df97d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.433 182627 DEBUG nova.compute.manager [req-ce50bef4-c50c-49fe-a403-fec9d1faa858 req-5c9d6abc-8791-49dc-a51b-6b9d1df97d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] No waiting events found dispatching network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:55:08 np0005592767 nova_compute[182623]: 2026-01-22 22:55:08.434 182627 WARNING nova.compute.manager [req-ce50bef4-c50c-49fe-a403-fec9d1faa858 req-5c9d6abc-8791-49dc-a51b-6b9d1df97d17 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received unexpected event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd for instance with vm_state active and task_state None.#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.280 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:09.502 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:55:09 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:09.504 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.668 182627 DEBUG nova.compute.manager [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-changed-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.669 182627 DEBUG nova.compute.manager [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Refreshing instance network info cache due to event network-changed-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.669 182627 DEBUG oslo_concurrency.lockutils [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.669 182627 DEBUG oslo_concurrency.lockutils [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:55:09 np0005592767 nova_compute[182623]: 2026-01-22 22:55:09.669 182627 DEBUG nova.network.neutron [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Refreshing network info cache for port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:55:11 np0005592767 podman[239874]: 2026-01-22 22:55:11.135353194 +0000 UTC m=+0.053569038 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.172 182627 DEBUG nova.network.neutron [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updated VIF entry in instance network info cache for port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.173 182627 DEBUG nova.network.neutron [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updating instance_info_cache with network_info: [{"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.194 182627 DEBUG oslo_concurrency.lockutils [req-bf9fd25d-96b5-4704-964a-0fae3e1040c8 req-bdd420d1-f2a6-4983-a0e1-b03e005c5965 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.319 182627 DEBUG nova.compute.manager [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-changed-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.320 182627 DEBUG nova.compute.manager [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Refreshing instance network info cache due to event network-changed-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.320 182627 DEBUG oslo_concurrency.lockutils [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.320 182627 DEBUG oslo_concurrency.lockutils [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:55:11 np0005592767 nova_compute[182623]: 2026-01-22 22:55:11.320 182627 DEBUG nova.network.neutron [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Refreshing network info cache for port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:55:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:12.125 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:12.126 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:12.127 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:13 np0005592767 nova_compute[182623]: 2026-01-22 22:55:13.177 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:13 np0005592767 nova_compute[182623]: 2026-01-22 22:55:13.298 182627 DEBUG nova.network.neutron [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updated VIF entry in instance network info cache for port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:55:13 np0005592767 nova_compute[182623]: 2026-01-22 22:55:13.298 182627 DEBUG nova.network.neutron [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updating instance_info_cache with network_info: [{"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:55:13 np0005592767 nova_compute[182623]: 2026-01-22 22:55:13.315 182627 DEBUG oslo_concurrency.lockutils [req-8d320038-429b-4fb6-aba5-aecafbfb6ae0 req-d0fb6fa2-9893-4356-b397-839b2cf0d6eb 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-c468b88d-d414-4f0b-af1c-9b13676f4f04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:55:14 np0005592767 nova_compute[182623]: 2026-01-22 22:55:14.283 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:14 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:14.506 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:17Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:43:28:38 10.100.0.8
Jan 22 17:55:17 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:17Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:28:38 10.100.0.8
Jan 22 17:55:18 np0005592767 nova_compute[182623]: 2026-01-22 22:55:18.179 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:19 np0005592767 nova_compute[182623]: 2026-01-22 22:55:19.287 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:23 np0005592767 podman[239916]: 2026-01-22 22:55:23.17825589 +0000 UTC m=+0.097593924 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:55:23 np0005592767 nova_compute[182623]: 2026-01-22 22:55:23.182 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.057 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.057 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.058 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.058 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.058 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.075 182627 INFO nova.compute.manager [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Terminating instance#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.089 182627 DEBUG nova.compute.manager [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:55:24 np0005592767 kernel: tapf1f8dea9-5a (unregistering): left promiscuous mode
Jan 22 17:55:24 np0005592767 NetworkManager[54973]: <info>  [1769122524.1175] device (tapf1f8dea9-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:55:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:24Z|00775|binding|INFO|Releasing lport f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd from this chassis (sb_readonly=0)
Jan 22 17:55:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:24Z|00776|binding|INFO|Setting lport f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd down in Southbound
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.125 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:24Z|00777|binding|INFO|Removing iface tapf1f8dea9-5a ovn-installed in OVS
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.132 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:28:38 10.100.0.8', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'c468b88d-d414-4f0b-af1c-9b13676f4f04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd7cbcb5-c231-44bb-be1c-c0898fbee74d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.134 104135 INFO neutron.agent.ovn.metadata.agent [-] Port f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd in datapath 930b9b12-ffcc-452a-86e1-0321bc77aa71 unbound from our chassis#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.136 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 930b9b12-ffcc-452a-86e1-0321bc77aa71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.137 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[de510a98-b0aa-46a5-a55a-7de73efa8ffb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.137 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 namespace which is not needed anymore#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Jan 22 17:55:24 np0005592767 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000b2.scope: Consumed 11.881s CPU time.
Jan 22 17:55:24 np0005592767 systemd-machined[153912]: Machine qemu-92-instance-000000b2 terminated.
Jan 22 17:55:24 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [NOTICE]   (239863) : haproxy version is 2.8.14-c23fe91
Jan 22 17:55:24 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [NOTICE]   (239863) : path to executable is /usr/sbin/haproxy
Jan 22 17:55:24 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [WARNING]  (239863) : Exiting Master process...
Jan 22 17:55:24 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [WARNING]  (239863) : Exiting Master process...
Jan 22 17:55:24 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [ALERT]    (239863) : Current worker (239865) exited with code 143 (Terminated)
Jan 22 17:55:24 np0005592767 neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71[239859]: [WARNING]  (239863) : All workers exited. Exiting... (0)
Jan 22 17:55:24 np0005592767 systemd[1]: libpod-e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09.scope: Deactivated successfully.
Jan 22 17:55:24 np0005592767 podman[239961]: 2026-01-22 22:55:24.272724426 +0000 UTC m=+0.051258032 container died e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.289 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09-userdata-shm.mount: Deactivated successfully.
Jan 22 17:55:24 np0005592767 systemd[1]: var-lib-containers-storage-overlay-f7c014303a8e3a6df966c87a60d4d75e11c84a1268b70333f1622126408aa756-merged.mount: Deactivated successfully.
Jan 22 17:55:24 np0005592767 podman[239961]: 2026-01-22 22:55:24.315581429 +0000 UTC m=+0.094115015 container cleanup e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:55:24 np0005592767 systemd[1]: libpod-conmon-e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09.scope: Deactivated successfully.
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.368 182627 INFO nova.virt.libvirt.driver [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Instance destroyed successfully.#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.369 182627 DEBUG nova.objects.instance [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid c468b88d-d414-4f0b-af1c-9b13676f4f04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.384 182627 DEBUG nova.virt.libvirt.vif [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-gen-1-620159474',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-gen',id=178,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF3bM7NQGtG5yriXbrRucGgbXDO+Ih+16YwqKkCqi6iYV70uP3NcZajayx6+zddPMbVIGqZDPRroiJyEP2VGI5ncm7A4UPQ1aQzuh23PRRclUINeQtZ2TOXHx39xQlJ4JA==',key_name='tempest-TestSecurityGroupsBasicOps-227428177',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:55:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-qm2gyyqr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:55:06Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=c468b88d-d414-4f0b-af1c-9b13676f4f04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.384 182627 DEBUG nova.network.os_vif_util [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "address": "fa:16:3e:43:28:38", "network": {"id": "930b9b12-ffcc-452a-86e1-0321bc77aa71", "bridge": "br-int", "label": "tempest-network-smoke--317568404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1f8dea9-5a", "ovs_interfaceid": "f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.385 182627 DEBUG nova.network.os_vif_util [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.385 182627 DEBUG os_vif [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.387 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.388 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1f8dea9-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.389 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.391 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.392 182627 INFO os_vif [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:28:38,bridge_name='br-int',has_traffic_filtering=True,id=f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd,network=Network(930b9b12-ffcc-452a-86e1-0321bc77aa71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1f8dea9-5a')#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.393 182627 INFO nova.virt.libvirt.driver [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Deleting instance files /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04_del#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.394 182627 INFO nova.virt.libvirt.driver [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Deletion of /var/lib/nova/instances/c468b88d-d414-4f0b-af1c-9b13676f4f04_del complete#033[00m
Jan 22 17:55:24 np0005592767 podman[239998]: 2026-01-22 22:55:24.394540005 +0000 UTC m=+0.051445538 container remove e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.400 182627 DEBUG nova.compute.manager [req-b80553ee-046c-4487-8fa6-eb4e751d874c req-cd81ff8c-c292-4035-a1ee-043a84acc023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-vif-unplugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.400 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd386cf-76f8-4530-b414-b8418e5bee32]: (4, ('Thu Jan 22 10:55:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 (e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09)\ne523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09\nThu Jan 22 10:55:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 (e523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09)\ne523edf8b0f3995536d3668e4b016402a49e9c0e8fb948462c796f30af45ef09\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.400 182627 DEBUG oslo_concurrency.lockutils [req-b80553ee-046c-4487-8fa6-eb4e751d874c req-cd81ff8c-c292-4035-a1ee-043a84acc023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.401 182627 DEBUG oslo_concurrency.lockutils [req-b80553ee-046c-4487-8fa6-eb4e751d874c req-cd81ff8c-c292-4035-a1ee-043a84acc023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.401 182627 DEBUG oslo_concurrency.lockutils [req-b80553ee-046c-4487-8fa6-eb4e751d874c req-cd81ff8c-c292-4035-a1ee-043a84acc023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.401 182627 DEBUG nova.compute.manager [req-b80553ee-046c-4487-8fa6-eb4e751d874c req-cd81ff8c-c292-4035-a1ee-043a84acc023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] No waiting events found dispatching network-vif-unplugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.402 182627 DEBUG nova.compute.manager [req-b80553ee-046c-4487-8fa6-eb4e751d874c req-cd81ff8c-c292-4035-a1ee-043a84acc023 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-vif-unplugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.401 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[2629a370-1787-4979-8c03-26bf524add81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.402 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap930b9b12-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.404 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 kernel: tap930b9b12-f0: left promiscuous mode
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.416 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.419 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a93f1afd-fbfe-429f-ad88-456b27bba9e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.442 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8472876a-f08a-4be7-8852-37d68b82629a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.444 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bc43a42f-79f4-481c-aa37-eeb5778102b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.460 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[30cd4515-1517-4d80-a808-c47ba709e0ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610213, 'reachable_time': 34831, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240021, 'error': None, 'target': 'ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.462 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-930b9b12-ffcc-452a-86e1-0321bc77aa71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:55:24 np0005592767 systemd[1]: run-netns-ovnmeta\x2d930b9b12\x2dffcc\x2d452a\x2d86e1\x2d0321bc77aa71.mount: Deactivated successfully.
Jan 22 17:55:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:24.463 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[e69bd849-c139-4f59-9f58-ad0268779396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.469 182627 INFO nova.compute.manager [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.470 182627 DEBUG oslo.service.loopingcall [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.470 182627 DEBUG nova.compute.manager [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:55:24 np0005592767 nova_compute[182623]: 2026-01-22 22:55:24.471 182627 DEBUG nova.network.neutron [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.569 182627 DEBUG nova.network.neutron [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.587 182627 INFO nova.compute.manager [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Took 1.12 seconds to deallocate network for instance.#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.674 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.675 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.774 182627 DEBUG nova.compute.provider_tree [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.796 182627 DEBUG nova.scheduler.client.report [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.834 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.870 182627 INFO nova.scheduler.client.report [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance c468b88d-d414-4f0b-af1c-9b13676f4f04#033[00m
Jan 22 17:55:25 np0005592767 nova_compute[182623]: 2026-01-22 22:55:25.949 182627 DEBUG oslo_concurrency.lockutils [None req-9072d250-0b9d-4371-a50a-af1f1890fb4d 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.891s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.480 182627 DEBUG nova.compute.manager [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.480 182627 DEBUG oslo_concurrency.lockutils [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.481 182627 DEBUG oslo_concurrency.lockutils [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.481 182627 DEBUG oslo_concurrency.lockutils [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "c468b88d-d414-4f0b-af1c-9b13676f4f04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.482 182627 DEBUG nova.compute.manager [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] No waiting events found dispatching network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.482 182627 WARNING nova.compute.manager [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received unexpected event network-vif-plugged-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:55:26 np0005592767 nova_compute[182623]: 2026-01-22 22:55:26.482 182627 DEBUG nova.compute.manager [req-911ff79e-ba01-45c2-8f20-fb09f77108c8 req-221b96b7-016f-450f-84c8-748b1f6287ec 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Received event network-vif-deleted-f1f8dea9-5aa7-42f4-a890-ddd9e562f8bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:28 np0005592767 podman[240023]: 2026-01-22 22:55:28.143990468 +0000 UTC m=+0.057505749 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Jan 22 17:55:28 np0005592767 nova_compute[182623]: 2026-01-22 22:55:28.191 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:28 np0005592767 podman[240022]: 2026-01-22 22:55:28.20622259 +0000 UTC m=+0.122416526 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 22 17:55:29 np0005592767 nova_compute[182623]: 2026-01-22 22:55:29.427 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:33 np0005592767 nova_compute[182623]: 2026-01-22 22:55:33.230 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:33 np0005592767 nova_compute[182623]: 2026-01-22 22:55:33.877 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:34 np0005592767 nova_compute[182623]: 2026-01-22 22:55:34.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:34 np0005592767 nova_compute[182623]: 2026-01-22 22:55:34.428 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:35 np0005592767 nova_compute[182623]: 2026-01-22 22:55:35.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:36 np0005592767 podman[240068]: 2026-01-22 22:55:36.173800946 +0000 UTC m=+0.075151069 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 17:55:36 np0005592767 podman[240069]: 2026-01-22 22:55:36.175038971 +0000 UTC m=+0.081878599 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:55:36 np0005592767 nova_compute[182623]: 2026-01-22 22:55:36.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:36 np0005592767 nova_compute[182623]: 2026-01-22 22:55:36.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:55:36 np0005592767 nova_compute[182623]: 2026-01-22 22:55:36.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:55:36 np0005592767 nova_compute[182623]: 2026-01-22 22:55:36.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:55:38 np0005592767 nova_compute[182623]: 2026-01-22 22:55:38.231 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:38 np0005592767 nova_compute[182623]: 2026-01-22 22:55:38.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:39 np0005592767 nova_compute[182623]: 2026-01-22 22:55:39.367 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122524.3660908, c468b88d-d414-4f0b-af1c-9b13676f4f04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:55:39 np0005592767 nova_compute[182623]: 2026-01-22 22:55:39.367 182627 INFO nova.compute.manager [-] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:55:39 np0005592767 nova_compute[182623]: 2026-01-22 22:55:39.404 182627 DEBUG nova.compute.manager [None req-f951ade4-2661-4c60-9687-d99fd6ab2828 - - - - - -] [instance: c468b88d-d414-4f0b-af1c-9b13676f4f04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:55:39 np0005592767 nova_compute[182623]: 2026-01-22 22:55:39.429 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:40 np0005592767 nova_compute[182623]: 2026-01-22 22:55:40.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:40 np0005592767 nova_compute[182623]: 2026-01-22 22:55:40.918 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:40 np0005592767 nova_compute[182623]: 2026-01-22 22:55:40.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:40 np0005592767 nova_compute[182623]: 2026-01-22 22:55:40.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:40 np0005592767 nova_compute[182623]: 2026-01-22 22:55:40.919 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.103 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.105 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.05123138427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.105 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.105 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.166 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.167 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.187 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.209 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.228 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:55:41 np0005592767 nova_compute[182623]: 2026-01-22 22:55:41.229 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:42 np0005592767 podman[240110]: 2026-01-22 22:55:42.143074566 +0000 UTC m=+0.064706703 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:55:42 np0005592767 nova_compute[182623]: 2026-01-22 22:55:42.229 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:42 np0005592767 nova_compute[182623]: 2026-01-22 22:55:42.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:43 np0005592767 nova_compute[182623]: 2026-01-22 22:55:43.233 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:44 np0005592767 nova_compute[182623]: 2026-01-22 22:55:44.432 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:47 np0005592767 nova_compute[182623]: 2026-01-22 22:55:47.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:47 np0005592767 nova_compute[182623]: 2026-01-22 22:55:47.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:47 np0005592767 nova_compute[182623]: 2026-01-22 22:55:47.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:55:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:47.910 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:55:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:47.911 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:55:47 np0005592767 nova_compute[182623]: 2026-01-22 22:55:47.911 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:48 np0005592767 nova_compute[182623]: 2026-01-22 22:55:48.235 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:48.913 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:49 np0005592767 nova_compute[182623]: 2026-01-22 22:55:49.435 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:53 np0005592767 nova_compute[182623]: 2026-01-22 22:55:53.237 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:53 np0005592767 nova_compute[182623]: 2026-01-22 22:55:53.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:54 np0005592767 podman[240136]: 2026-01-22 22:55:54.152917757 +0000 UTC m=+0.066519554 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.438 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.727 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.728 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.752 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.867 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.868 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.879 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.879 182627 INFO nova.compute.claims [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:55:54 np0005592767 nova_compute[182623]: 2026-01-22 22:55:54.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.014 182627 DEBUG nova.compute.provider_tree [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.026 182627 DEBUG nova.scheduler.client.report [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.042 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.043 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.089 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.090 182627 DEBUG nova.network.neutron [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.106 182627 INFO nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.126 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.236 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.237 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.238 182627 INFO nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Creating image(s)#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.238 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "/var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.239 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.239 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "/var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.256 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.285 182627 DEBUG nova.policy [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.347 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.348 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.349 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.364 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.425 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.426 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.464 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.465 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.466 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.533 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.535 182627 DEBUG nova.virt.disk.api [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Checking if we can resize image /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.537 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.594 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.595 182627 DEBUG nova.virt.disk.api [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Cannot resize image /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.596 182627 DEBUG nova.objects.instance [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'migration_context' on Instance uuid 6f5c0f9b-42bb-425f-aae2-cad89722d21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.626 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.627 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Ensure instance console log exists: /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.628 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.628 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:55 np0005592767 nova_compute[182623]: 2026-01-22 22:55:55.629 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:56 np0005592767 nova_compute[182623]: 2026-01-22 22:55:56.496 182627 DEBUG nova.network.neutron [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Successfully created port: 4cd35b77-3e61-4cda-8aaf-331882c21f75 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.335 182627 DEBUG nova.network.neutron [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Successfully updated port: 4cd35b77-3e61-4cda-8aaf-331882c21f75 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.350 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.351 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquired lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.351 182627 DEBUG nova.network.neutron [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.404 182627 DEBUG nova.compute.manager [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-changed-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.404 182627 DEBUG nova.compute.manager [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Refreshing instance network info cache due to event network-changed-4cd35b77-3e61-4cda-8aaf-331882c21f75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.404 182627 DEBUG oslo_concurrency.lockutils [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:55:57 np0005592767 nova_compute[182623]: 2026-01-22 22:55:57.510 182627 DEBUG nova.network.neutron [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.240 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.468 182627 DEBUG nova.network.neutron [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updating instance_info_cache with network_info: [{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.971 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Releasing lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.972 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Instance network_info: |[{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.973 182627 DEBUG oslo_concurrency.lockutils [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.973 182627 DEBUG nova.network.neutron [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Refreshing network info cache for port 4cd35b77-3e61-4cda-8aaf-331882c21f75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.977 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Start _get_guest_xml network_info=[{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.984 182627 WARNING nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.989 182627 DEBUG nova.virt.libvirt.host [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:55:58 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.990 182627 DEBUG nova.virt.libvirt.host [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:58.999 182627 DEBUG nova.virt.libvirt.host [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.000 182627 DEBUG nova.virt.libvirt.host [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.001 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.002 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.002 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.002 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.003 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.003 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.003 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.003 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.004 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.004 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.004 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.005 182627 DEBUG nova.virt.hardware [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.009 182627 DEBUG nova.virt.libvirt.vif [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:55:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=180,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD84xZdPIFoxp03M+qy8ubX2yjnPVx4s19faDbEEd2GlXdkDQvPT5ly+rGsLhEu7hsGSTeKn0265dQ5QI8wo2Iew2eNOe20T1gxb3IbUJadj9xf8ub5lKfCQgVpfE1xbBw==',key_name='tempest-TestSecurityGroupsBasicOps-1433766663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-hoe0ttwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:55:55Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=6f5c0f9b-42bb-425f-aae2-cad89722d21a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.009 182627 DEBUG nova.network.os_vif_util [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.010 182627 DEBUG nova.network.os_vif_util [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.012 182627 DEBUG nova.objects.instance [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'pci_devices' on Instance uuid 6f5c0f9b-42bb-425f-aae2-cad89722d21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.035 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <uuid>6f5c0f9b-42bb-425f-aae2-cad89722d21a</uuid>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <name>instance-000000b4</name>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059</nova:name>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:55:58</nova:creationTime>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:user uuid="57cadc74575048b298f2ab431b92531e">tempest-TestSecurityGroupsBasicOps-838015615-project-member</nova:user>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:project uuid="bbcf23c8115e43a0af378f72b41c2f1b">tempest-TestSecurityGroupsBasicOps-838015615</nova:project>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        <nova:port uuid="4cd35b77-3e61-4cda-8aaf-331882c21f75">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <entry name="serial">6f5c0f9b-42bb-425f-aae2-cad89722d21a</entry>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <entry name="uuid">6f5c0f9b-42bb-425f-aae2-cad89722d21a</entry>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.config"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:14:58:50"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <target dev="tap4cd35b77-3e"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/console.log" append="off"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:55:59 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:55:59 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:55:59 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:55:59 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.037 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Preparing to wait for external event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.037 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.037 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.038 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.038 182627 DEBUG nova.virt.libvirt.vif [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:55:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=180,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD84xZdPIFoxp03M+qy8ubX2yjnPVx4s19faDbEEd2GlXdkDQvPT5ly+rGsLhEu7hsGSTeKn0265dQ5QI8wo2Iew2eNOe20T1gxb3IbUJadj9xf8ub5lKfCQgVpfE1xbBw==',key_name='tempest-TestSecurityGroupsBasicOps-1433766663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-hoe0ttwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:55:55Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=6f5c0f9b-42bb-425f-aae2-cad89722d21a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.039 182627 DEBUG nova.network.os_vif_util [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.040 182627 DEBUG nova.network.os_vif_util [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.040 182627 DEBUG os_vif [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.041 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.042 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.047 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4cd35b77-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.047 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4cd35b77-3e, col_values=(('external_ids', {'iface-id': '4cd35b77-3e61-4cda-8aaf-331882c21f75', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:58:50', 'vm-uuid': '6f5c0f9b-42bb-425f-aae2-cad89722d21a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.091 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 NetworkManager[54973]: <info>  [1769122559.0925] manager: (tap4cd35b77-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.093 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.101 182627 INFO os_vif [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e')#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.148 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.149 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.149 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] No VIF found with MAC fa:16:3e:14:58:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.150 182627 INFO nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Using config drive#033[00m
Jan 22 17:55:59 np0005592767 podman[240176]: 2026-01-22 22:55:59.196216912 +0000 UTC m=+0.068148661 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Jan 22 17:55:59 np0005592767 podman[240174]: 2026-01-22 22:55:59.209472097 +0000 UTC m=+0.088297171 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.590 182627 INFO nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Creating config drive at /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.config#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.595 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5rf2fdxt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.719 182627 DEBUG oslo_concurrency.processutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5rf2fdxt" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:55:59 np0005592767 kernel: tap4cd35b77-3e: entered promiscuous mode
Jan 22 17:55:59 np0005592767 NetworkManager[54973]: <info>  [1769122559.8027] manager: (tap4cd35b77-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.803 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:59Z|00778|binding|INFO|Claiming lport 4cd35b77-3e61-4cda-8aaf-331882c21f75 for this chassis.
Jan 22 17:55:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:59Z|00779|binding|INFO|4cd35b77-3e61-4cda-8aaf-331882c21f75: Claiming fa:16:3e:14:58:50 10.100.0.5
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.816 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:58:50 10.100.0.5'], port_security=['fa:16:3e:14:58:50 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2166113-b582-4acb-94d4-cca8c2589ce5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '800afde8-d00b-4348-b69b-b46974950191 e084b098-262d-45bd-97a6-6f8d267bf424', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2d1b52b-e43f-4c02-8d8d-bb5591e6462d, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4cd35b77-3e61-4cda-8aaf-331882c21f75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.819 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4cd35b77-3e61-4cda-8aaf-331882c21f75 in datapath b2166113-b582-4acb-94d4-cca8c2589ce5 bound to our chassis#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.821 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2166113-b582-4acb-94d4-cca8c2589ce5#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.841 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[98348765-1478-4e02-9e6e-9b7e704be983]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.842 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2166113-b1 in ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.845 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2166113-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.845 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b0157603-c537-4daa-80bb-301e9da9cbb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.846 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e2334f3e-edac-409f-aa58-f64d6d5883cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 systemd-machined[153912]: New machine qemu-93-instance-000000b4.
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.856 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[a9794826-5a04-4eda-acdc-82821ab7932f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.864 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:59Z|00780|binding|INFO|Setting lport 4cd35b77-3e61-4cda-8aaf-331882c21f75 ovn-installed in OVS
Jan 22 17:55:59 np0005592767 ovn_controller[94769]: 2026-01-22T22:55:59Z|00781|binding|INFO|Setting lport 4cd35b77-3e61-4cda-8aaf-331882c21f75 up in Southbound
Jan 22 17:55:59 np0005592767 nova_compute[182623]: 2026-01-22 22:55:59.870 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.870 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[c34e015f-6bbc-43f9-8857-dc1ab7f530d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 systemd[1]: Started Virtual Machine qemu-93-instance-000000b4.
Jan 22 17:55:59 np0005592767 systemd-udevd[240243]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:55:59 np0005592767 NetworkManager[54973]: <info>  [1769122559.8986] device (tap4cd35b77-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:55:59 np0005592767 NetworkManager[54973]: <info>  [1769122559.9006] device (tap4cd35b77-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.915 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4c47bf5b-3161-4ce2-aa5f-e64e36396bd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.921 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a056bb11-ed19-47a8-a007-e275d623c335]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 NetworkManager[54973]: <info>  [1769122559.9228] manager: (tapb2166113-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/369)
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.964 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[869ca637-328b-491c-8b90-af96acce82cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.967 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[8628f1b9-2d02-4bba-afed-c75a16b30e62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:55:59 np0005592767 NetworkManager[54973]: <info>  [1769122559.9918] device (tapb2166113-b0): carrier: link connected
Jan 22 17:55:59 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:55:59.996 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[e0423b8e-ca98-4b35-ab1c-cf6a91d78459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.012 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d651b206-d3b1-4284-80f1-bf1e02210ae0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2166113-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:1c:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615656, 'reachable_time': 23365, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240272, 'error': None, 'target': 'ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.025 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9a42bd5c-8ddc-414e-bbec-3ad7330d3818]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:1cf6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 615656, 'tstamp': 615656}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240273, 'error': None, 'target': 'ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.041 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5e56f6-bebd-472b-9aa3-3b4e388621a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2166113-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:1c:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615656, 'reachable_time': 23365, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240274, 'error': None, 'target': 'ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.069 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2baa8f-a74a-4080-8229-22128c5e80c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.148 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dd058955-efef-4bfa-8929-da4e98175514]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.150 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2166113-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.151 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.152 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2166113-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.196 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:00 np0005592767 kernel: tapb2166113-b0: entered promiscuous mode
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.208 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:00 np0005592767 NetworkManager[54973]: <info>  [1769122560.2094] manager: (tapb2166113-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.210 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2166113-b0, col_values=(('external_ids', {'iface-id': 'aa5b611d-1e09-444a-a393-65b4b53532dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.212 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:00 np0005592767 ovn_controller[94769]: 2026-01-22T22:56:00Z|00782|binding|INFO|Releasing lport aa5b611d-1e09-444a-a393-65b4b53532dd from this chassis (sb_readonly=0)
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.227 182627 DEBUG nova.compute.manager [req-99ea1aff-70dc-4374-9cfa-763047587250 req-c22c6ad2-f665-4c43-b340-3fb94002c053 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.228 182627 DEBUG oslo_concurrency.lockutils [req-99ea1aff-70dc-4374-9cfa-763047587250 req-c22c6ad2-f665-4c43-b340-3fb94002c053 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.228 182627 DEBUG oslo_concurrency.lockutils [req-99ea1aff-70dc-4374-9cfa-763047587250 req-c22c6ad2-f665-4c43-b340-3fb94002c053 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.229 182627 DEBUG oslo_concurrency.lockutils [req-99ea1aff-70dc-4374-9cfa-763047587250 req-c22c6ad2-f665-4c43-b340-3fb94002c053 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.229 182627 DEBUG nova.compute.manager [req-99ea1aff-70dc-4374-9cfa-763047587250 req-c22c6ad2-f665-4c43-b340-3fb94002c053 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Processing event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.232 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.233 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2166113-b582-4acb-94d4-cca8c2589ce5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2166113-b582-4acb-94d4-cca8c2589ce5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.234 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4199e87e-10bb-4ed5-9826-40a75ee701bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.235 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-b2166113-b582-4acb-94d4-cca8c2589ce5
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/b2166113-b582-4acb-94d4-cca8c2589ce5.pid.haproxy
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID b2166113-b582-4acb-94d4-cca8c2589ce5
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:56:00 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:00.236 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5', 'env', 'PROCESS_TAG=haproxy-b2166113-b582-4acb-94d4-cca8c2589ce5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2166113-b582-4acb-94d4-cca8c2589ce5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:56:00 np0005592767 podman[240306]: 2026-01-22 22:56:00.592827312 +0000 UTC m=+0.022678143 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.803 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122560.8025033, 6f5c0f9b-42bb-425f-aae2-cad89722d21a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.803 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] VM Started (Lifecycle Event)#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.806 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.809 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.813 182627 INFO nova.virt.libvirt.driver [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Instance spawned successfully.#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.814 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.873 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:56:00 np0005592767 nova_compute[182623]: 2026-01-22 22:56:00.877 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.000 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.001 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.002 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.002 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.003 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.004 182627 DEBUG nova.virt.libvirt.driver [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.008 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.009 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122560.8026648, 6f5c0f9b-42bb-425f-aae2-cad89722d21a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.009 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.139 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.144 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122560.8081145, 6f5c0f9b-42bb-425f-aae2-cad89722d21a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.145 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.385 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.388 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:56:01 np0005592767 podman[240306]: 2026-01-22 22:56:01.541274834 +0000 UTC m=+0.971125635 container create 4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.731 182627 INFO nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Took 6.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.732 182627 DEBUG nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:56:01 np0005592767 nova_compute[182623]: 2026-01-22 22:56:01.734 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:56:01 np0005592767 systemd[1]: Started libpod-conmon-4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5.scope.
Jan 22 17:56:01 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:56:01 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57db18a13d451b0f83d6f66c378b0acf158b8ee44212db66708081832e0a8fe1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:56:01 np0005592767 podman[240306]: 2026-01-22 22:56:01.914062609 +0000 UTC m=+1.343913410 container init 4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:56:01 np0005592767 podman[240306]: 2026-01-22 22:56:01.924171035 +0000 UTC m=+1.354021826 container start 4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:56:01 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [NOTICE]   (240332) : New worker (240334) forked
Jan 22 17:56:01 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [NOTICE]   (240332) : Loading success.
Jan 22 17:56:02 np0005592767 nova_compute[182623]: 2026-01-22 22:56:02.059 182627 INFO nova.compute.manager [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Took 7.23 seconds to build instance.#033[00m
Jan 22 17:56:02 np0005592767 nova_compute[182623]: 2026-01-22 22:56:02.087 182627 DEBUG oslo_concurrency.lockutils [None req-cee546df-4deb-48f8-9f07-fcf0e593fca8 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:56:02 np0005592767 nova_compute[182623]: 2026-01-22 22:56:02.254 182627 DEBUG nova.network.neutron [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updated VIF entry in instance network info cache for port 4cd35b77-3e61-4cda-8aaf-331882c21f75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:56:02 np0005592767 nova_compute[182623]: 2026-01-22 22:56:02.254 182627 DEBUG nova.network.neutron [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updating instance_info_cache with network_info: [{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:56:02 np0005592767 nova_compute[182623]: 2026-01-22 22:56:02.330 182627 DEBUG oslo_concurrency.lockutils [req-8add59eb-f0f9-4464-8bdd-465a6b25acbb req-9e5264c9-d211-4457-9d2e-da6b2b5bfc9e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.241 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.874 182627 DEBUG nova.compute.manager [req-b9898e3f-b5c8-4c00-9d47-6b1890ceecc9 req-45969c31-40d3-4969-a2c2-0bb559dbbebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.874 182627 DEBUG oslo_concurrency.lockutils [req-b9898e3f-b5c8-4c00-9d47-6b1890ceecc9 req-45969c31-40d3-4969-a2c2-0bb559dbbebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.875 182627 DEBUG oslo_concurrency.lockutils [req-b9898e3f-b5c8-4c00-9d47-6b1890ceecc9 req-45969c31-40d3-4969-a2c2-0bb559dbbebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.875 182627 DEBUG oslo_concurrency.lockutils [req-b9898e3f-b5c8-4c00-9d47-6b1890ceecc9 req-45969c31-40d3-4969-a2c2-0bb559dbbebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.875 182627 DEBUG nova.compute.manager [req-b9898e3f-b5c8-4c00-9d47-6b1890ceecc9 req-45969c31-40d3-4969-a2c2-0bb559dbbebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] No waiting events found dispatching network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:56:03 np0005592767 nova_compute[182623]: 2026-01-22 22:56:03.875 182627 WARNING nova.compute.manager [req-b9898e3f-b5c8-4c00-9d47-6b1890ceecc9 req-45969c31-40d3-4969-a2c2-0bb559dbbebc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received unexpected event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 for instance with vm_state active and task_state None.#033[00m
Jan 22 17:56:04 np0005592767 nova_compute[182623]: 2026-01-22 22:56:04.092 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:05 np0005592767 nova_compute[182623]: 2026-01-22 22:56:05.671 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:05 np0005592767 NetworkManager[54973]: <info>  [1769122565.6717] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Jan 22 17:56:05 np0005592767 NetworkManager[54973]: <info>  [1769122565.6725] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 22 17:56:05 np0005592767 nova_compute[182623]: 2026-01-22 22:56:05.726 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:05 np0005592767 ovn_controller[94769]: 2026-01-22T22:56:05Z|00783|binding|INFO|Releasing lport aa5b611d-1e09-444a-a393-65b4b53532dd from this chassis (sb_readonly=0)
Jan 22 17:56:05 np0005592767 nova_compute[182623]: 2026-01-22 22:56:05.741 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:06 np0005592767 nova_compute[182623]: 2026-01-22 22:56:06.188 182627 DEBUG nova.compute.manager [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-changed-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:56:06 np0005592767 nova_compute[182623]: 2026-01-22 22:56:06.189 182627 DEBUG nova.compute.manager [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Refreshing instance network info cache due to event network-changed-4cd35b77-3e61-4cda-8aaf-331882c21f75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:56:06 np0005592767 nova_compute[182623]: 2026-01-22 22:56:06.189 182627 DEBUG oslo_concurrency.lockutils [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:56:06 np0005592767 nova_compute[182623]: 2026-01-22 22:56:06.190 182627 DEBUG oslo_concurrency.lockutils [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:56:06 np0005592767 nova_compute[182623]: 2026-01-22 22:56:06.190 182627 DEBUG nova.network.neutron [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Refreshing network info cache for port 4cd35b77-3e61-4cda-8aaf-331882c21f75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:56:07 np0005592767 podman[240344]: 2026-01-22 22:56:07.139992253 +0000 UTC m=+0.057674844 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 17:56:07 np0005592767 podman[240345]: 2026-01-22 22:56:07.17485828 +0000 UTC m=+0.080219082 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.332 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b4', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'user_id': '57cadc74575048b298f2ab431b92531e', 'hostId': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.333 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.348 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.350 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff563067-5cf5-4b61-9831-1e10650aeb75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.333729', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '896a3ebc-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.968659571, 'message_signature': 'e850c224e59b43649302af8b6a8600d2f3ecc9811462512e9cdbdfba091be6cd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 
'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.333729', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '896a5b04-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.968659571, 'message_signature': '866e6dd3f086c2990a548e7b6587cb5ded00ef09790756922155ec2ea3f98f05'}]}, 'timestamp': '2026-01-22 22:56:07.350866', '_unique_id': '924d6746703648a1829793e241271834'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.354 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.388 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.389 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbc9bc9a-6bcd-43ab-abc8-79e155bdda09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.356236', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89702f84-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '6611ccc8f4d6d800af26ae095a180ff5d60b58b3681fc8e40fa37266947ec2c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 
'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.356236', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8970469a-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': 'dd57904fd6b914bf4fb91cd321ab3f09f7880995986630876602ac218673a64a'}]}, 'timestamp': '2026-01-22 22:56:07.389595', '_unique_id': '221c538eb4c4486889fa4361b7fe3fdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.392 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.392 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.393 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c72565dd-4dce-469c-8c1b-1976a16ab9c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.392798', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8970d7ea-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.968659571, 'message_signature': '4f6617f18e38cc4152494b5eb8a8ab5a18301b11f5000ddd829a3e1896dd20c5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.392798', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8970efe6-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.968659571, 'message_signature': 'c5f93b2b6d7cefdf52219754ae315cdc258de4809bc987b8cf41d44ce79222f4'}]}, 'timestamp': '2026-01-22 22:56:07.393965', '_unique_id': '7c65fc59b60843b88e1ad08efec33136'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.395 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.396 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.401 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6f5c0f9b-42bb-425f-aae2-cad89722d21a / tap4cd35b77-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.401 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '191581da-7922-41c9-b7f9-39657b600d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.397098', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '897244ae-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': 'e6305c98a6788a3358b112f54822873850dddc558ec695572b1bbb4d0086b24a'}]}, 'timestamp': '2026-01-22 22:56:07.402763', '_unique_id': '78b07879f6d24a4aa308e1a5618fef27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.404 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.405 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a47fda2f-5465-4d20-9061-923117e74995', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.405877', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '8972d77a-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': '0f010754b3884140e05242c101b73be7d630b258440c1e16ed8b7ce45ec67c79'}]}, 'timestamp': '2026-01-22 22:56:07.406468', '_unique_id': 'd21e66b6c49d46d68ed0aafd6d736ded'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.408 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.409 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.409 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b60ae42-61f2-472a-9a02-c418ebee2070', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.409137', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89735718-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.968659571, 'message_signature': 'a70f9ada558ab48355c8f3616f22981616ceba5b2ad394ff25e17f7b91e0c38f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.409137', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '897368b6-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.968659571, 'message_signature': '0da9dc45db4e361cfa9cf3bb8f42400930141248c35cdad03810b7a2ece2dd7e'}]}, 'timestamp': '2026-01-22 22:56:07.410098', '_unique_id': 'c3060c94749f4200a9e40618ec2cd580'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.412 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.428 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/cpu volume: 6350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83c9e3bd-1a98-4986-b20e-ddd20a741e42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6350000000, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'timestamp': '2026-01-22T22:56:07.412807', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '89764658-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.063054094, 'message_signature': 'cd415beb0a49093ab01675e2700961a38789bdec9fb96f6d230ad4251bab9c6d'}]}, 'timestamp': '2026-01-22 22:56:07.428889', '_unique_id': 'fc0cd903cf1e4e7191ae796cba33b6bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.431 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.431 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a1d2faf-b590-436a-8375-57ee7b07d4f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.431070', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8976abe8-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': 'c3a0ed6ff5273ee8588304da8f3ef6ec1099da98a190816428de6ddd98e2fdc5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.431070', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8976b7fa-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': 'b1e7f9e8fc62dba301f45e5b61cc5be112f4212cabdb69210c3cb2adaa250b7c'}]}, 'timestamp': '2026-01-22 22:56:07.431719', '_unique_id': 'e996bf8e6a464d40913fb9f1d39c7fce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.433 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd668a463-1f92-4ce1-9782-6375477f896e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.433565', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '89770c96-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': '14c078f82b42d9b0fdc1249a8f4bfe068695ef591c962ce391d4afe4610d80c9'}]}, 'timestamp': '2026-01-22 22:56:07.433908', '_unique_id': 'be2b681adf794045aa5368276428f6b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.434 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.435 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.435 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.435 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 6f5c0f9b-42bb-425f-aae2-cad89722d21a: ceilometer.compute.pollsters.NoVolumeException
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.435 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.435 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.436 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>]
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.436 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.436 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.436 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>]
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.436 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.436 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2fdf01b-1f20-4b36-b797-f17e9043a23f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.436821', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '89778bbc-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': '6b9351f1ae54e28011287be7e8c26b2213049e02d56c520224235e5be4482d62'}]}, 'timestamp': '2026-01-22 22:56:07.437163', '_unique_id': 'ce39c226894b49e998cf07f309d98c8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.438 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.438 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.read.latency volume: 158629226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.439 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.read.latency volume: 498504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40aca098-26dd-428c-a9f8-66de43878f27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 158629226, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.438840', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8977da68-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '9a8ed53dc1e0ad7c1a881a78446fa2048f31fe3b350749b5a025a537577e9a4e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 498504, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.438840', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8977e756-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '723404c26e4bc3befacf82f324650e182e03608bb900b33434f73c44d4646090'}]}, 'timestamp': '2026-01-22 22:56:07.439485', '_unique_id': '911f73968d9349cbad7fb5d8f3efd27f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.440 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.441 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.441 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.441 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae930299-a69b-49ca-b6ba-bdd3d281c804', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.441295', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89783a44-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '5b9fb263ed8e191cade18723104510cec9b9c50d0c7846a8e0e3be6fed18e8a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.441295', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '897845c0-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': 'bacfaa7b0fc12fe642434642ac72966b06f871161479325e3db05f31b80e4fd6'}]}, 'timestamp': '2026-01-22 22:56:07.441901', '_unique_id': 'cc904340481444f69150bdc5a5015f17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.442 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.443 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a11e4cd6-8618-42b9-826a-66f6fddf316e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.443597', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '89789458-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': 'e227cd377b0932d908f01337ca9cb04cf0064c53be71a73b24171ec0cb21e907'}]}, 'timestamp': '2026-01-22 22:56:07.443937', '_unique_id': 'bc89f52ea0e64934b5cacc0daa22f615'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.444 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.445 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.445 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1474874f-0835-4c11-a009-fa9756b19ee9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.445585', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '8978e1d8-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': '996e828d6c2629323185cbd63caa32011419176f4ea5da1c7300a26569b961d1'}]}, 'timestamp': '2026-01-22 22:56:07.445917', '_unique_id': '94a8137f1cc646b59c6095e9daa7821c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.446 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.447 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.447 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dda151b1-357b-436d-bb2c-8ffe5eedb913', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.447526', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89792d82-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '7f1414198265c2061b9be105b2c970d93cbbbe2a8da1fd689bc5ca2856da2fb3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.447526', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8979394e-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '03e9587c62f5719b618efb4c45ad7f9faade4acfe06f49840496b056ec630227'}]}, 'timestamp': '2026-01-22 22:56:07.448142', '_unique_id': '567ceaf0788d42a7af7e6b542b8fdb92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.448 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.449 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3755bfbf-856e-4272-924b-841a2e42375b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.449774', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '897985a2-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': 'df63ef0af25c8c52aefa4c43aaa706cc1aa15eea86ce0e645ac128f9d713f569'}]}, 'timestamp': '2026-01-22 22:56:07.450113', '_unique_id': 'a0d8136e5458479e9363abb63d53a3fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.450 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.451 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.451 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>]
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.452 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '300dbe37-abf4-48c2-a3de-3a4adad8b604', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.452161', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '8979e3f8-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': 'e60464cf54d81f65cf7360e61fa4744078f0770f0709cff5311e0648b31970d1'}]}, 'timestamp': '2026-01-22 22:56:07.452528', '_unique_id': '77614ef531124fea8fea0f80648ac139'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.453 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.454 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.454 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059>]
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.454 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.455 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2792f6f1-4807-4b91-a4a0-f52e7c274d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-vda', 'timestamp': '2026-01-22T22:56:07.454712', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '897a4794-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '43b7acde6bdae06677561c8adca4c73291654d86ba79ff87efa5e7ce338a19ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a-sda', 'timestamp': '2026-01-22T22:56:07.454712', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'instance-000000b4', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '897a53e2-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6163.991157908, 'message_signature': '88dab0b930a23730e6d804020510a2984cdda17bd93531b4a943571c25b3cd74'}]}, 'timestamp': '2026-01-22 22:56:07.455369', '_unique_id': '221da08214ab468f8fce635673179675'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.457 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c5ccda9-ce3d-451d-bf23-1520454ef322', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.456999', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '897a9fc8-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': '8df927bb1952f0df0a7da5d6a623a0f0884028df8f28684de83fa2ed21baa5ab'}]}, 'timestamp': '2026-01-22 22:56:07.457360', '_unique_id': '52476197dfda4343a6c9e1522317167c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.458 12 DEBUG ceilometer.compute.pollsters [-] 6f5c0f9b-42bb-425f-aae2-cad89722d21a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc4b7c2-c212-42f2-a8b9-5bc5668b7dd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57cadc74575048b298f2ab431b92531e', 'user_name': None, 'project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'project_name': None, 'resource_id': 'instance-000000b4-6f5c0f9b-42bb-425f-aae2-cad89722d21a-tap4cd35b77-3e', 'timestamp': '2026-01-22T22:56:07.458958', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059', 'name': 'tap4cd35b77-3e', 'instance_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'instance_type': 'm1.nano', 'host': '86195ca671440fbead9dd4099f0839eb64d05bc474f887b4e76d98fa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '63b0d901-60c2-48cb-afeb-72a71e897d3d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}, 'image_ref': '48dd0ec8-2856-44d4-b286-44fdc64ba78d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:14:58:50', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4cd35b77-3e'}, 'message_id': '897aec12-f7e5-11f0-a43a-fa163ed01feb', 'monotonic_time': 6164.032045546, 'message_signature': '68fd86f4db7eaa9d179a2378794869bfd77dc6d900af68b32263e895733dd172'}]}, 'timestamp': '2026-01-22 22:56:07.459312', '_unique_id': 'f258de7583a243a4aaa02775309b3d22'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 17:56:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:56:07.459 12 ERROR oslo_messaging.notify.messaging 
Jan 22 17:56:08 np0005592767 nova_compute[182623]: 2026-01-22 22:56:08.042 182627 DEBUG nova.network.neutron [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updated VIF entry in instance network info cache for port 4cd35b77-3e61-4cda-8aaf-331882c21f75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 17:56:08 np0005592767 nova_compute[182623]: 2026-01-22 22:56:08.042 182627 DEBUG nova.network.neutron [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updating instance_info_cache with network_info: [{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 17:56:08 np0005592767 nova_compute[182623]: 2026-01-22 22:56:08.245 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:08 np0005592767 nova_compute[182623]: 2026-01-22 22:56:08.343 182627 DEBUG oslo_concurrency.lockutils [req-ab0b3cca-31d0-4d08-8184-8d8d0886947d req-29ca8cf4-87e1-47da-8a5b-948160056d13 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 17:56:09 np0005592767 nova_compute[182623]: 2026-01-22 22:56:09.095 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:56:11 np0005592767 ovn_controller[94769]: 2026-01-22T22:56:11Z|00784|binding|INFO|Releasing lport aa5b611d-1e09-444a-a393-65b4b53532dd from this chassis (sb_readonly=0)
Jan 22 17:56:11 np0005592767 nova_compute[182623]: 2026-01-22 22:56:11.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:12.126 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:56:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:12.127 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:56:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:12.128 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:56:13 np0005592767 podman[240402]: 2026-01-22 22:56:13.143150693 +0000 UTC m=+0.063062667 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:56:13 np0005592767 nova_compute[182623]: 2026-01-22 22:56:13.248 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:56:13Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:14:58:50 10.100.0.5
Jan 22 17:56:13 np0005592767 ovn_controller[94769]: 2026-01-22T22:56:13Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:14:58:50 10.100.0.5
Jan 22 17:56:14 np0005592767 nova_compute[182623]: 2026-01-22 22:56:14.098 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:18 np0005592767 nova_compute[182623]: 2026-01-22 22:56:18.250 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:19 np0005592767 nova_compute[182623]: 2026-01-22 22:56:19.101 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:23 np0005592767 nova_compute[182623]: 2026-01-22 22:56:23.253 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:24 np0005592767 nova_compute[182623]: 2026-01-22 22:56:24.148 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:24 np0005592767 nova_compute[182623]: 2026-01-22 22:56:24.672 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:25 np0005592767 podman[240426]: 2026-01-22 22:56:25.175420003 +0000 UTC m=+0.087119647 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:56:28 np0005592767 nova_compute[182623]: 2026-01-22 22:56:28.255 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:29 np0005592767 nova_compute[182623]: 2026-01-22 22:56:29.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:30 np0005592767 nova_compute[182623]: 2026-01-22 22:56:30.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:30 np0005592767 podman[240447]: 2026-01-22 22:56:30.182084729 +0000 UTC m=+0.100384224 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Jan 22 17:56:30 np0005592767 podman[240446]: 2026-01-22 22:56:30.199289176 +0000 UTC m=+0.112175997 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 17:56:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:32.068 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:87:28 10.100.0.2 2001:db8::f816:3eff:fe34:8728'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe34:8728/64', 'neutron:device_id': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a169643-63c0-4f43-aa55-2402edf1efd7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7e638a4c-ee74-4c71-b2dd-f7bbad609134) old=Port_Binding(mac=['fa:16:3e:34:87:28 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:56:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:32.069 104135 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7e638a4c-ee74-4c71-b2dd-f7bbad609134 in datapath d8d77c31-420b-47d9-87ac-6c37fe7e216d updated#033[00m
Jan 22 17:56:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:32.070 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8d77c31-420b-47d9-87ac-6c37fe7e216d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:56:32 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:32.071 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[23620739-16dc-429f-af17-7bceebb79489]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:56:33 np0005592767 nova_compute[182623]: 2026-01-22 22:56:33.257 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:34 np0005592767 nova_compute[182623]: 2026-01-22 22:56:34.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:36 np0005592767 nova_compute[182623]: 2026-01-22 22:56:36.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:36 np0005592767 nova_compute[182623]: 2026-01-22 22:56:36.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:56:36 np0005592767 nova_compute[182623]: 2026-01-22 22:56:36.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:56:37 np0005592767 nova_compute[182623]: 2026-01-22 22:56:37.052 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:56:37 np0005592767 nova_compute[182623]: 2026-01-22 22:56:37.053 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:56:37 np0005592767 nova_compute[182623]: 2026-01-22 22:56:37.054 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:56:37 np0005592767 nova_compute[182623]: 2026-01-22 22:56:37.054 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6f5c0f9b-42bb-425f-aae2-cad89722d21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:56:38 np0005592767 podman[240492]: 2026-01-22 22:56:38.141491272 +0000 UTC m=+0.058099466 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 17:56:38 np0005592767 podman[240493]: 2026-01-22 22:56:38.148915782 +0000 UTC m=+0.059767023 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:56:38 np0005592767 nova_compute[182623]: 2026-01-22 22:56:38.308 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:39 np0005592767 nova_compute[182623]: 2026-01-22 22:56:39.155 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:40 np0005592767 nova_compute[182623]: 2026-01-22 22:56:40.339 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updating instance_info_cache with network_info: [{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:56:40 np0005592767 nova_compute[182623]: 2026-01-22 22:56:40.359 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:56:40 np0005592767 nova_compute[182623]: 2026-01-22 22:56:40.360 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:56:40 np0005592767 nova_compute[182623]: 2026-01-22 22:56:40.360 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:40 np0005592767 nova_compute[182623]: 2026-01-22 22:56:40.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:40.732 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:56:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:40.734 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:56:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:56:40.736 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:56:40 np0005592767 nova_compute[182623]: 2026-01-22 22:56:40.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.926 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:56:41 np0005592767 nova_compute[182623]: 2026-01-22 22:56:41.997 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.088 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.090 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.186 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.395 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.396 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5548MB free_disk=73.0221061706543GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.397 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.397 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.472 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 6f5c0f9b-42bb-425f-aae2-cad89722d21a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.473 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.474 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.486 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.501 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.501 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.518 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.543 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.580 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.601 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.629 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:56:42 np0005592767 nova_compute[182623]: 2026-01-22 22:56:42.629 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:56:43 np0005592767 nova_compute[182623]: 2026-01-22 22:56:43.309 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:44 np0005592767 podman[240540]: 2026-01-22 22:56:44.153413269 +0000 UTC m=+0.065249759 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:56:44 np0005592767 nova_compute[182623]: 2026-01-22 22:56:44.157 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:44 np0005592767 nova_compute[182623]: 2026-01-22 22:56:44.628 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:47 np0005592767 nova_compute[182623]: 2026-01-22 22:56:47.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:47 np0005592767 nova_compute[182623]: 2026-01-22 22:56:47.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:56:48 np0005592767 nova_compute[182623]: 2026-01-22 22:56:48.312 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:48 np0005592767 nova_compute[182623]: 2026-01-22 22:56:48.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:49 np0005592767 nova_compute[182623]: 2026-01-22 22:56:49.159 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:53 np0005592767 nova_compute[182623]: 2026-01-22 22:56:53.368 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:54 np0005592767 nova_compute[182623]: 2026-01-22 22:56:54.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:56 np0005592767 podman[240566]: 2026-01-22 22:56:56.139950019 +0000 UTC m=+0.060342640 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 17:56:56 np0005592767 nova_compute[182623]: 2026-01-22 22:56:56.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:56:58 np0005592767 nova_compute[182623]: 2026-01-22 22:56:58.371 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:56:59 np0005592767 nova_compute[182623]: 2026-01-22 22:56:59.164 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:01 np0005592767 podman[240588]: 2026-01-22 22:57:01.177343564 +0000 UTC m=+0.088678822 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 22 17:57:01 np0005592767 podman[240587]: 2026-01-22 22:57:01.177589381 +0000 UTC m=+0.094636470 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:57:03 np0005592767 nova_compute[182623]: 2026-01-22 22:57:03.374 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:03 np0005592767 nova_compute[182623]: 2026-01-22 22:57:03.721 182627 DEBUG nova.compute.manager [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-changed-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:03 np0005592767 nova_compute[182623]: 2026-01-22 22:57:03.722 182627 DEBUG nova.compute.manager [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Refreshing instance network info cache due to event network-changed-4cd35b77-3e61-4cda-8aaf-331882c21f75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:57:03 np0005592767 nova_compute[182623]: 2026-01-22 22:57:03.722 182627 DEBUG oslo_concurrency.lockutils [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:57:03 np0005592767 nova_compute[182623]: 2026-01-22 22:57:03.723 182627 DEBUG oslo_concurrency.lockutils [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:57:03 np0005592767 nova_compute[182623]: 2026-01-22 22:57:03.723 182627 DEBUG nova.network.neutron [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Refreshing network info cache for port 4cd35b77-3e61-4cda-8aaf-331882c21f75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.165 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.292 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.292 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.293 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.293 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.293 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.311 182627 INFO nova.compute.manager [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Terminating instance#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.325 182627 DEBUG nova.compute.manager [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:57:04 np0005592767 kernel: tap4cd35b77-3e (unregistering): left promiscuous mode
Jan 22 17:57:04 np0005592767 NetworkManager[54973]: <info>  [1769122624.3499] device (tap4cd35b77-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:57:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:04Z|00785|binding|INFO|Releasing lport 4cd35b77-3e61-4cda-8aaf-331882c21f75 from this chassis (sb_readonly=0)
Jan 22 17:57:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:04Z|00786|binding|INFO|Setting lport 4cd35b77-3e61-4cda-8aaf-331882c21f75 down in Southbound
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.355 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:04Z|00787|binding|INFO|Removing iface tap4cd35b77-3e ovn-installed in OVS
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.357 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:04.367 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:58:50 10.100.0.5'], port_security=['fa:16:3e:14:58:50 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6f5c0f9b-42bb-425f-aae2-cad89722d21a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2166113-b582-4acb-94d4-cca8c2589ce5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbcf23c8115e43a0af378f72b41c2f1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '800afde8-d00b-4348-b69b-b46974950191 e084b098-262d-45bd-97a6-6f8d267bf424', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2d1b52b-e43f-4c02-8d8d-bb5591e6462d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=4cd35b77-3e61-4cda-8aaf-331882c21f75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:57:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:04.368 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 4cd35b77-3e61-4cda-8aaf-331882c21f75 in datapath b2166113-b582-4acb-94d4-cca8c2589ce5 unbound from our chassis#033[00m
Jan 22 17:57:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:04.370 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2166113-b582-4acb-94d4-cca8c2589ce5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:57:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:04.371 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6a9342-75a0-4c28-98db-a3b47fa69858]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:04 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:04.372 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5 namespace which is not needed anymore#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.376 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Jan 22 17:57:04 np0005592767 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000b4.scope: Consumed 15.224s CPU time.
Jan 22 17:57:04 np0005592767 systemd-machined[153912]: Machine qemu-93-instance-000000b4 terminated.
Jan 22 17:57:04 np0005592767 kernel: tap4cd35b77-3e: entered promiscuous mode
Jan 22 17:57:04 np0005592767 kernel: tap4cd35b77-3e (unregistering): left promiscuous mode
Jan 22 17:57:04 np0005592767 NetworkManager[54973]: <info>  [1769122624.5489] manager: (tap4cd35b77-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.598 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.632 182627 INFO nova.virt.libvirt.driver [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Instance destroyed successfully.#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.633 182627 DEBUG nova.objects.instance [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lazy-loading 'resources' on Instance uuid 6f5c0f9b-42bb-425f-aae2-cad89722d21a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:57:04 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [NOTICE]   (240332) : haproxy version is 2.8.14-c23fe91
Jan 22 17:57:04 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [NOTICE]   (240332) : path to executable is /usr/sbin/haproxy
Jan 22 17:57:04 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [WARNING]  (240332) : Exiting Master process...
Jan 22 17:57:04 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [ALERT]    (240332) : Current worker (240334) exited with code 143 (Terminated)
Jan 22 17:57:04 np0005592767 neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5[240328]: [WARNING]  (240332) : All workers exited. Exiting... (0)
Jan 22 17:57:04 np0005592767 systemd[1]: libpod-4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5.scope: Deactivated successfully.
Jan 22 17:57:04 np0005592767 podman[240661]: 2026-01-22 22:57:04.717787349 +0000 UTC m=+0.258233193 container died 4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.765 182627 DEBUG nova.virt.libvirt.vif [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:55:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-838015615-access_point-2015078059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-838015615-acc',id=180,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD84xZdPIFoxp03M+qy8ubX2yjnPVx4s19faDbEEd2GlXdkDQvPT5ly+rGsLhEu7hsGSTeKn0265dQ5QI8wo2Iew2eNOe20T1gxb3IbUJadj9xf8ub5lKfCQgVpfE1xbBw==',key_name='tempest-TestSecurityGroupsBasicOps-1433766663',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:56:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bbcf23c8115e43a0af378f72b41c2f1b',ramdisk_id='',reservation_id='r-hoe0ttwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-838015615',owner_user_name='tempest-TestSecurityGroupsBasicOps-838015615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:56:01Z,user_data=None,user_id='57cadc74575048b298f2ab431b92531e',uuid=6f5c0f9b-42bb-425f-aae2-cad89722d21a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.766 182627 DEBUG nova.network.os_vif_util [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converting VIF {"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.766 182627 DEBUG nova.network.os_vif_util [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.766 182627 DEBUG os_vif [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.768 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.768 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4cd35b77-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.772 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.772 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.774 182627 INFO os_vif [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:58:50,bridge_name='br-int',has_traffic_filtering=True,id=4cd35b77-3e61-4cda-8aaf-331882c21f75,network=Network(b2166113-b582-4acb-94d4-cca8c2589ce5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4cd35b77-3e')#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.775 182627 INFO nova.virt.libvirt.driver [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Deleting instance files /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a_del#033[00m
Jan 22 17:57:04 np0005592767 nova_compute[182623]: 2026-01-22 22:57:04.776 182627 INFO nova.virt.libvirt.driver [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Deletion of /var/lib/nova/instances/6f5c0f9b-42bb-425f-aae2-cad89722d21a_del complete#033[00m
Jan 22 17:57:05 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5-userdata-shm.mount: Deactivated successfully.
Jan 22 17:57:05 np0005592767 systemd[1]: var-lib-containers-storage-overlay-57db18a13d451b0f83d6f66c378b0acf158b8ee44212db66708081832e0a8fe1-merged.mount: Deactivated successfully.
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.259 182627 INFO nova.compute.manager [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Took 0.93 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.260 182627 DEBUG oslo.service.loopingcall [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.260 182627 DEBUG nova.compute.manager [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.261 182627 DEBUG nova.network.neutron [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:57:05 np0005592767 podman[240661]: 2026-01-22 22:57:05.35467618 +0000 UTC m=+0.895122004 container cleanup 4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:57:05 np0005592767 systemd[1]: libpod-conmon-4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5.scope: Deactivated successfully.
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.587 182627 DEBUG nova.network.neutron [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updated VIF entry in instance network info cache for port 4cd35b77-3e61-4cda-8aaf-331882c21f75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.588 182627 DEBUG nova.network.neutron [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updating instance_info_cache with network_info: [{"id": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "address": "fa:16:3e:14:58:50", "network": {"id": "b2166113-b582-4acb-94d4-cca8c2589ce5", "bridge": "br-int", "label": "tempest-network-smoke--1599425952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bbcf23c8115e43a0af378f72b41c2f1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4cd35b77-3e", "ovs_interfaceid": "4cd35b77-3e61-4cda-8aaf-331882c21f75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.652 182627 DEBUG nova.compute.manager [req-a4830f83-40e9-4231-9187-e25a81cd9a86 req-8119c8f2-ccb3-41a5-bfbe-fbbcc588693e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-vif-unplugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.652 182627 DEBUG oslo_concurrency.lockutils [req-a4830f83-40e9-4231-9187-e25a81cd9a86 req-8119c8f2-ccb3-41a5-bfbe-fbbcc588693e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.652 182627 DEBUG oslo_concurrency.lockutils [req-a4830f83-40e9-4231-9187-e25a81cd9a86 req-8119c8f2-ccb3-41a5-bfbe-fbbcc588693e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.653 182627 DEBUG oslo_concurrency.lockutils [req-a4830f83-40e9-4231-9187-e25a81cd9a86 req-8119c8f2-ccb3-41a5-bfbe-fbbcc588693e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.653 182627 DEBUG nova.compute.manager [req-a4830f83-40e9-4231-9187-e25a81cd9a86 req-8119c8f2-ccb3-41a5-bfbe-fbbcc588693e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] No waiting events found dispatching network-vif-unplugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.653 182627 DEBUG nova.compute.manager [req-a4830f83-40e9-4231-9187-e25a81cd9a86 req-8119c8f2-ccb3-41a5-bfbe-fbbcc588693e 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-vif-unplugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:57:05 np0005592767 nova_compute[182623]: 2026-01-22 22:57:05.946 182627 DEBUG oslo_concurrency.lockutils [req-a082b5c1-0e9f-413c-9d9a-fc4e4c5fd228 req-1c302f43-d7ea-4e45-b9cf-0595485c2694 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-6f5c0f9b-42bb-425f-aae2-cad89722d21a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:57:06 np0005592767 podman[240707]: 2026-01-22 22:57:06.063263142 +0000 UTC m=+0.677540494 container remove 4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.069 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[661c1ddc-85bc-4b12-87a2-f39945ec9cc0]: (4, ('Thu Jan 22 10:57:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5 (4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5)\n4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5\nThu Jan 22 10:57:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5 (4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5)\n4af0e0fd6b16261f3b9be920e29e3896049e4139ca2253b9ee8056ce00dd6db5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.071 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[252b45d4-abde-42c1-8b8c-ea4a7be1d9b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.072 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2166113-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.074 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:06 np0005592767 kernel: tapb2166113-b0: left promiscuous mode
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.079 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18c56dd7-f1a7-44a3-b570-139bafe8d528]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.090 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.105 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[79755203-4547-4f63-8c9f-fe06c71bbf24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.107 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5afb69fe-259d-42ee-94f8-f164e2403a74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.135 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[1215119c-a8b0-4b21-ae4f-400b6528b3bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615648, 'reachable_time': 25060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240723, 'error': None, 'target': 'ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 systemd[1]: run-netns-ovnmeta\x2db2166113\x2db582\x2d4acb\x2d94d4\x2dcca8c2589ce5.mount: Deactivated successfully.
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.138 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2166113-b582-4acb-94d4-cca8c2589ce5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:57:06 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:06.139 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[6c51ccaa-20d8-45cb-9631-cbd99154a5cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.631 182627 DEBUG nova.network.neutron [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.654 182627 INFO nova.compute.manager [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Took 1.39 seconds to deallocate network for instance.#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.708 182627 DEBUG nova.compute.manager [req-97e0b6db-131a-4a59-8fe4-dfe2bf958da4 req-53d14651-9de4-4d80-b6cb-bf9f343c03b2 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-vif-deleted-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.736 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.737 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.785 182627 DEBUG nova.compute.provider_tree [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.799 182627 DEBUG nova.scheduler.client.report [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.815 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:06 np0005592767 nova_compute[182623]: 2026-01-22 22:57:06.836 182627 INFO nova.scheduler.client.report [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Deleted allocations for instance 6f5c0f9b-42bb-425f-aae2-cad89722d21a#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.678 182627 DEBUG oslo_concurrency.lockutils [None req-fbd40e7f-45f8-412c-9fc8-a8dfbb0162ee 57cadc74575048b298f2ab431b92531e bbcf23c8115e43a0af378f72b41c2f1b - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.748 182627 DEBUG nova.compute.manager [req-7887f971-5896-41a0-a994-2f535efe31fe req-6e3d5c63-5f85-46b5-a359-5e4081924114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.748 182627 DEBUG oslo_concurrency.lockutils [req-7887f971-5896-41a0-a994-2f535efe31fe req-6e3d5c63-5f85-46b5-a359-5e4081924114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.748 182627 DEBUG oslo_concurrency.lockutils [req-7887f971-5896-41a0-a994-2f535efe31fe req-6e3d5c63-5f85-46b5-a359-5e4081924114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.749 182627 DEBUG oslo_concurrency.lockutils [req-7887f971-5896-41a0-a994-2f535efe31fe req-6e3d5c63-5f85-46b5-a359-5e4081924114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "6f5c0f9b-42bb-425f-aae2-cad89722d21a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.749 182627 DEBUG nova.compute.manager [req-7887f971-5896-41a0-a994-2f535efe31fe req-6e3d5c63-5f85-46b5-a359-5e4081924114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] No waiting events found dispatching network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:57:07 np0005592767 nova_compute[182623]: 2026-01-22 22:57:07.749 182627 WARNING nova.compute.manager [req-7887f971-5896-41a0-a994-2f535efe31fe req-6e3d5c63-5f85-46b5-a359-5e4081924114 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Received unexpected event network-vif-plugged-4cd35b77-3e61-4cda-8aaf-331882c21f75 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:57:08 np0005592767 nova_compute[182623]: 2026-01-22 22:57:08.375 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:09 np0005592767 podman[240724]: 2026-01-22 22:57:09.129738449 +0000 UTC m=+0.048987358 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 22 17:57:09 np0005592767 podman[240725]: 2026-01-22 22:57:09.129948705 +0000 UTC m=+0.047062914 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 17:57:09 np0005592767 nova_compute[182623]: 2026-01-22 22:57:09.771 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.033 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.034 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.055 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.138 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.139 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.146 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.147 182627 INFO nova.compute.claims [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.733 182627 DEBUG nova.compute.provider_tree [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.959 182627 DEBUG nova.scheduler.client.report [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.984 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:11 np0005592767 nova_compute[182623]: 2026-01-22 22:57:11.985 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.043 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.044 182627 DEBUG nova.network.neutron [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.057 182627 INFO nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.071 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:57:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:12.127 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:12.127 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:12.127 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.164 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.166 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.167 182627 INFO nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Creating image(s)#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.168 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "/var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.169 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.170 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "/var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.195 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.243 182627 DEBUG nova.policy [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17723e69e2af4d3d9c5837bae2a0ad5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61f6867826994602937cf08774d215cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.257 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.258 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.259 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.269 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.322 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.323 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.360 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.361 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.362 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.388 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.446 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.447 182627 DEBUG nova.virt.disk.api [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Checking if we can resize image /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.447 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.501 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.513 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.514 182627 DEBUG nova.virt.disk.api [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Cannot resize image /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.514 182627 DEBUG nova.objects.instance [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'migration_context' on Instance uuid 625dab51-0d70-4c53-9794-741a3c4ccfc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.534 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.534 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Ensure instance console log exists: /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.535 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.535 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:12 np0005592767 nova_compute[182623]: 2026-01-22 22:57:12.535 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:13 np0005592767 nova_compute[182623]: 2026-01-22 22:57:13.406 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:13 np0005592767 nova_compute[182623]: 2026-01-22 22:57:13.729 182627 DEBUG nova.network.neutron [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Successfully created port: ad58676d-c707-42ac-95cb-bafdf2aa7e4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:57:14 np0005592767 nova_compute[182623]: 2026-01-22 22:57:14.774 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:15 np0005592767 podman[240783]: 2026-01-22 22:57:15.158243045 +0000 UTC m=+0.070833177 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.524 182627 DEBUG nova.network.neutron [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Successfully updated port: ad58676d-c707-42ac-95cb-bafdf2aa7e4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.544 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.544 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquired lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.545 182627 DEBUG nova.network.neutron [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.670 182627 DEBUG nova.compute.manager [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-changed-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.671 182627 DEBUG nova.compute.manager [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Refreshing instance network info cache due to event network-changed-ad58676d-c707-42ac-95cb-bafdf2aa7e4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.671 182627 DEBUG oslo_concurrency.lockutils [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:57:15 np0005592767 nova_compute[182623]: 2026-01-22 22:57:15.769 182627 DEBUG nova.network.neutron [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.935 182627 DEBUG nova.network.neutron [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updating instance_info_cache with network_info: [{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.957 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Releasing lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.958 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Instance network_info: |[{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.958 182627 DEBUG oslo_concurrency.lockutils [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.959 182627 DEBUG nova.network.neutron [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Refreshing network info cache for port ad58676d-c707-42ac-95cb-bafdf2aa7e4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.962 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Start _get_guest_xml network_info=[{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.966 182627 WARNING nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.970 182627 DEBUG nova.virt.libvirt.host [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.971 182627 DEBUG nova.virt.libvirt.host [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.974 182627 DEBUG nova.virt.libvirt.host [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.974 182627 DEBUG nova.virt.libvirt.host [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.975 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.975 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.976 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.976 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.976 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.976 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.977 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.977 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.977 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.977 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.977 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.978 182627 DEBUG nova.virt.hardware [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.981 182627 DEBUG nova.virt.libvirt.vif [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1642431739',display_name='tempest-TestGettingAddress-server-1642431739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1642431739',id=183,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHZ1KMZo+qWxA9GoBPq33zivQ9decEOfNbceKLZHuHOqsDs/tyA3D4sE3L5JZ1B6KO/xDwF+p7p9iYpYMQYQwF7tVFf0iZITIwPPCAfwIzY9rt9qx275gQpD1U0a/ULPw==',key_name='tempest-TestGettingAddress-1399143294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-4xb9s2cl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:57:12Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=625dab51-0d70-4c53-9794-741a3c4ccfc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.981 182627 DEBUG nova.network.os_vif_util [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.982 182627 DEBUG nova.network.os_vif_util [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.982 182627 DEBUG nova.objects.instance [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'pci_devices' on Instance uuid 625dab51-0d70-4c53-9794-741a3c4ccfc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:57:17 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.996 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <uuid>625dab51-0d70-4c53-9794-741a3c4ccfc0</uuid>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <name>instance-000000b7</name>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestGettingAddress-server-1642431739</nova:name>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:57:17</nova:creationTime>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:user uuid="17723e69e2af4d3d9c5837bae2a0ad5f">tempest-TestGettingAddress-1431418722-project-member</nova:user>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:project uuid="61f6867826994602937cf08774d215cf">tempest-TestGettingAddress-1431418722</nova:project>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        <nova:port uuid="ad58676d-c707-42ac-95cb-bafdf2aa7e4b">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe66:8db3" ipVersion="6"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <entry name="serial">625dab51-0d70-4c53-9794-741a3c4ccfc0</entry>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <entry name="uuid">625dab51-0d70-4c53-9794-741a3c4ccfc0</entry>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:57:17 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.config"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:66:8d:b3"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <target dev="tapad58676d-c7"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/console.log" append="off"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:57:18 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:57:18 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:57:18 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:57:18 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.998 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Preparing to wait for external event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.998 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.998 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.999 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:17.999 182627 DEBUG nova.virt.libvirt.vif [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1642431739',display_name='tempest-TestGettingAddress-server-1642431739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1642431739',id=183,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHZ1KMZo+qWxA9GoBPq33zivQ9decEOfNbceKLZHuHOqsDs/tyA3D4sE3L5JZ1B6KO/xDwF+p7p9iYpYMQYQwF7tVFf0iZITIwPPCAfwIzY9rt9qx275gQpD1U0a/ULPw==',key_name='tempest-TestGettingAddress-1399143294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-4xb9s2cl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:57:12Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=625dab51-0d70-4c53-9794-741a3c4ccfc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.000 182627 DEBUG nova.network.os_vif_util [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.000 182627 DEBUG nova.network.os_vif_util [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.001 182627 DEBUG os_vif [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.001 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.002 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.002 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.005 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.005 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad58676d-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.006 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapad58676d-c7, col_values=(('external_ids', {'iface-id': 'ad58676d-c707-42ac-95cb-bafdf2aa7e4b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:8d:b3', 'vm-uuid': '625dab51-0d70-4c53-9794-741a3c4ccfc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.0091] manager: (tapad58676d-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.008 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.012 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.015 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.015 182627 INFO os_vif [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7')#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.139 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.140 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.141 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] No VIF found with MAC fa:16:3e:66:8d:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.141 182627 INFO nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Using config drive#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.409 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.686 182627 INFO nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Creating config drive at /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.config#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.694 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_e82lteb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.821 182627 DEBUG oslo_concurrency.processutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_e82lteb" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:57:18 np0005592767 kernel: tapad58676d-c7: entered promiscuous mode
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.8801] manager: (tapad58676d-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/375)
Jan 22 17:57:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:18Z|00788|binding|INFO|Claiming lport ad58676d-c707-42ac-95cb-bafdf2aa7e4b for this chassis.
Jan 22 17:57:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:18Z|00789|binding|INFO|ad58676d-c707-42ac-95cb-bafdf2aa7e4b: Claiming fa:16:3e:66:8d:b3 10.100.0.13 2001:db8::f816:3eff:fe66:8db3
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.882 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.889 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.891 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.8922] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.8930] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.898 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:8d:b3 10.100.0.13 2001:db8::f816:3eff:fe66:8db3'], port_security=['fa:16:3e:66:8d:b3 10.100.0.13 2001:db8::f816:3eff:fe66:8db3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe66:8db3/64', 'neutron:device_id': '625dab51-0d70-4c53-9794-741a3c4ccfc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ed139db-5528-4ba0-9d69-09cd70ed61c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a169643-63c0-4f43-aa55-2402edf1efd7, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ad58676d-c707-42ac-95cb-bafdf2aa7e4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.899 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ad58676d-c707-42ac-95cb-bafdf2aa7e4b in datapath d8d77c31-420b-47d9-87ac-6c37fe7e216d bound to our chassis#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.900 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8d77c31-420b-47d9-87ac-6c37fe7e216d#033[00m
Jan 22 17:57:18 np0005592767 systemd-udevd[240828]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.910 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[17ee1cce-8d12-4590-9306-f187c19692d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.910 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8d77c31-41 in ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.912 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8d77c31-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.912 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a0952f35-86ba-425c-951d-ea635ef950c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.913 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d527df-1fef-40d0-9b22-b2c47b43f258]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 systemd-machined[153912]: New machine qemu-94-instance-000000b7.
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.9231] device (tapad58676d-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.9235] device (tapad58676d-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.923 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[b9de9164-5cbc-4403-92e1-429261497786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 systemd[1]: Started Virtual Machine qemu-94-instance-000000b7.
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.948 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd8561c-9caf-4580-a619-58e42d1e1c81]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.960 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.975 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c04a368c-71a4-464e-be23-9728fa3db9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:18Z|00790|binding|INFO|Setting lport ad58676d-c707-42ac-95cb-bafdf2aa7e4b ovn-installed in OVS
Jan 22 17:57:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:18Z|00791|binding|INFO|Setting lport ad58676d-c707-42ac-95cb-bafdf2aa7e4b up in Southbound
Jan 22 17:57:18 np0005592767 nova_compute[182623]: 2026-01-22 22:57:18.978 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:18.980 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a94d55a1-2ab5-4d33-80ba-c6061136b251]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:18 np0005592767 NetworkManager[54973]: <info>  [1769122638.9817] manager: (tapd8d77c31-40): new Veth device (/org/freedesktop/NetworkManager/Devices/378)
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.012 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[7571fdbe-84d0-4b42-8d84-9f3e71dee765]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.014 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[342ccdfa-04f7-4a9e-8331-71768fb788f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 NetworkManager[54973]: <info>  [1769122639.0356] device (tapd8d77c31-40): carrier: link connected
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.042 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[86eff53f-e597-4510-9c5a-a347c3094a8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.055 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[348f32a9-5841-40f9-985d-8f0fbf6b76ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8d77c31-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:87:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623561, 'reachable_time': 30743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240860, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.071 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab1dade-ea61-4861-babe-9e5281d655de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe34:8728'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623561, 'tstamp': 623561}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240861, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.087 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa662df-3d5d-4584-8b3d-d5e5ba05fa64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8d77c31-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:34:87:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 239], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623561, 'reachable_time': 30743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240862, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.114 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6964fb-f11d-466a-a0a9-951afb8b454d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.163 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0c9d6c-75b9-4fe7-9369-840e436f3942]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.165 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8d77c31-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.165 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.166 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8d77c31-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.167 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:19 np0005592767 kernel: tapd8d77c31-40: entered promiscuous mode
Jan 22 17:57:19 np0005592767 NetworkManager[54973]: <info>  [1769122639.1693] manager: (tapd8d77c31-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.170 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.171 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8d77c31-40, col_values=(('external_ids', {'iface-id': '7e638a4c-ee74-4c71-b2dd-f7bbad609134'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.172 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:19 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:19Z|00792|binding|INFO|Releasing lport 7e638a4c-ee74-4c71-b2dd-f7bbad609134 from this chassis (sb_readonly=0)
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.173 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.173 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8d77c31-420b-47d9-87ac-6c37fe7e216d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8d77c31-420b-47d9-87ac-6c37fe7e216d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.174 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[78f78a11-2a26-47d7-bad9-7a7132956b59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.175 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-d8d77c31-420b-47d9-87ac-6c37fe7e216d
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/d8d77c31-420b-47d9-87ac-6c37fe7e216d.pid.haproxy
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID d8d77c31-420b-47d9-87ac-6c37fe7e216d
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:57:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:19.175 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'env', 'PROCESS_TAG=haproxy-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8d77c31-420b-47d9-87ac-6c37fe7e216d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.183 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.280 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122639.280255, 625dab51-0d70-4c53-9794-741a3c4ccfc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.282 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] VM Started (Lifecycle Event)#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.312 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.316 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122639.2813187, 625dab51-0d70-4c53-9794-741a3c4ccfc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.316 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.336 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.340 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.363 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:57:19 np0005592767 podman[240901]: 2026-01-22 22:57:19.487519515 +0000 UTC m=+0.042039901 container create e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 17:57:19 np0005592767 systemd[1]: Started libpod-conmon-e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2.scope.
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.541 182627 DEBUG nova.compute.manager [req-3638a2f9-f2bb-4e91-a09a-15c440a14682 req-c6238022-c422-41fe-81fa-0e1d09e24981 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.541 182627 DEBUG oslo_concurrency.lockutils [req-3638a2f9-f2bb-4e91-a09a-15c440a14682 req-c6238022-c422-41fe-81fa-0e1d09e24981 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.541 182627 DEBUG oslo_concurrency.lockutils [req-3638a2f9-f2bb-4e91-a09a-15c440a14682 req-c6238022-c422-41fe-81fa-0e1d09e24981 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.542 182627 DEBUG oslo_concurrency.lockutils [req-3638a2f9-f2bb-4e91-a09a-15c440a14682 req-c6238022-c422-41fe-81fa-0e1d09e24981 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.542 182627 DEBUG nova.compute.manager [req-3638a2f9-f2bb-4e91-a09a-15c440a14682 req-c6238022-c422-41fe-81fa-0e1d09e24981 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Processing event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.543 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:57:19 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.546 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122639.54545, 625dab51-0d70-4c53-9794-741a3c4ccfc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.546 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.547 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:57:19 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7dd0a3c14dc086bd27d5f113689517b9dcaece28d15841cc068c7f8ee16eb003/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.550 182627 INFO nova.virt.libvirt.driver [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Instance spawned successfully.#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.551 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:57:19 np0005592767 podman[240901]: 2026-01-22 22:57:19.464598296 +0000 UTC m=+0.019118682 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:57:19 np0005592767 podman[240901]: 2026-01-22 22:57:19.563120385 +0000 UTC m=+0.117640801 container init e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 17:57:19 np0005592767 podman[240901]: 2026-01-22 22:57:19.568125487 +0000 UTC m=+0.122645873 container start e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.577 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.582 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.582 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.583 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.584 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.584 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.585 182627 DEBUG nova.virt.libvirt.driver [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:57:19 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [NOTICE]   (240920) : New worker (240922) forked
Jan 22 17:57:19 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [NOTICE]   (240920) : Loading success.
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.590 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.631 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122624.6307395, 6f5c0f9b-42bb-425f-aae2-cad89722d21a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.632 182627 INFO nova.compute.manager [-] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.637 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.669 182627 DEBUG nova.compute.manager [None req-dd716222-97a4-4e2f-8629-adda67d3a08a - - - - - -] [instance: 6f5c0f9b-42bb-425f-aae2-cad89722d21a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.687 182627 INFO nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Took 7.52 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.687 182627 DEBUG nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.784 182627 INFO nova.compute.manager [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Took 8.68 seconds to build instance.#033[00m
Jan 22 17:57:19 np0005592767 nova_compute[182623]: 2026-01-22 22:57:19.807 182627 DEBUG oslo_concurrency.lockutils [None req-91af264b-888e-4523-b196-fabf996844a3 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:20 np0005592767 nova_compute[182623]: 2026-01-22 22:57:20.451 182627 DEBUG nova.network.neutron [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updated VIF entry in instance network info cache for port ad58676d-c707-42ac-95cb-bafdf2aa7e4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:57:20 np0005592767 nova_compute[182623]: 2026-01-22 22:57:20.452 182627 DEBUG nova.network.neutron [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updating instance_info_cache with network_info: [{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:20 np0005592767 nova_compute[182623]: 2026-01-22 22:57:20.469 182627 DEBUG oslo_concurrency.lockutils [req-ccfd0fb5-d392-4867-bd5c-f8a68d40b514 req-3668d929-6a4b-4ef6-8005-2f9f284b931c 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:57:21 np0005592767 nova_compute[182623]: 2026-01-22 22:57:21.701 182627 DEBUG nova.compute.manager [req-ec35fb38-ceef-4a13-b821-eb788d8c7c74 req-c14dcfc8-23c8-40c4-89d0-8aae15412674 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:21 np0005592767 nova_compute[182623]: 2026-01-22 22:57:21.702 182627 DEBUG oslo_concurrency.lockutils [req-ec35fb38-ceef-4a13-b821-eb788d8c7c74 req-c14dcfc8-23c8-40c4-89d0-8aae15412674 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:21 np0005592767 nova_compute[182623]: 2026-01-22 22:57:21.702 182627 DEBUG oslo_concurrency.lockutils [req-ec35fb38-ceef-4a13-b821-eb788d8c7c74 req-c14dcfc8-23c8-40c4-89d0-8aae15412674 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:21 np0005592767 nova_compute[182623]: 2026-01-22 22:57:21.702 182627 DEBUG oslo_concurrency.lockutils [req-ec35fb38-ceef-4a13-b821-eb788d8c7c74 req-c14dcfc8-23c8-40c4-89d0-8aae15412674 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:21 np0005592767 nova_compute[182623]: 2026-01-22 22:57:21.702 182627 DEBUG nova.compute.manager [req-ec35fb38-ceef-4a13-b821-eb788d8c7c74 req-c14dcfc8-23c8-40c4-89d0-8aae15412674 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] No waiting events found dispatching network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:57:21 np0005592767 nova_compute[182623]: 2026-01-22 22:57:21.703 182627 WARNING nova.compute.manager [req-ec35fb38-ceef-4a13-b821-eb788d8c7c74 req-c14dcfc8-23c8-40c4-89d0-8aae15412674 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received unexpected event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b for instance with vm_state active and task_state None.#033[00m
Jan 22 17:57:23 np0005592767 nova_compute[182623]: 2026-01-22 22:57:23.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:23 np0005592767 nova_compute[182623]: 2026-01-22 22:57:23.429 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:23 np0005592767 nova_compute[182623]: 2026-01-22 22:57:23.434 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:24 np0005592767 nova_compute[182623]: 2026-01-22 22:57:24.019 182627 DEBUG nova.compute.manager [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-changed-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:24 np0005592767 nova_compute[182623]: 2026-01-22 22:57:24.020 182627 DEBUG nova.compute.manager [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Refreshing instance network info cache due to event network-changed-ad58676d-c707-42ac-95cb-bafdf2aa7e4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:57:24 np0005592767 nova_compute[182623]: 2026-01-22 22:57:24.021 182627 DEBUG oslo_concurrency.lockutils [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:57:24 np0005592767 nova_compute[182623]: 2026-01-22 22:57:24.021 182627 DEBUG oslo_concurrency.lockutils [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:57:24 np0005592767 nova_compute[182623]: 2026-01-22 22:57:24.021 182627 DEBUG nova.network.neutron [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Refreshing network info cache for port ad58676d-c707-42ac-95cb-bafdf2aa7e4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:57:24 np0005592767 nova_compute[182623]: 2026-01-22 22:57:24.353 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:24.354 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:57:24 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:24.357 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:57:25 np0005592767 nova_compute[182623]: 2026-01-22 22:57:25.583 182627 DEBUG nova.network.neutron [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updated VIF entry in instance network info cache for port ad58676d-c707-42ac-95cb-bafdf2aa7e4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:57:25 np0005592767 nova_compute[182623]: 2026-01-22 22:57:25.584 182627 DEBUG nova.network.neutron [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updating instance_info_cache with network_info: [{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:25 np0005592767 nova_compute[182623]: 2026-01-22 22:57:25.609 182627 DEBUG oslo_concurrency.lockutils [req-85033e04-01ed-40c1-b411-cec70b7b0d4e req-3105a842-7eb3-4760-956b-bc5185b380d1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:57:25 np0005592767 nova_compute[182623]: 2026-01-22 22:57:25.797 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:27 np0005592767 podman[240932]: 2026-01-22 22:57:27.134832684 +0000 UTC m=+0.060572016 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:57:28 np0005592767 nova_compute[182623]: 2026-01-22 22:57:28.013 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:28 np0005592767 nova_compute[182623]: 2026-01-22 22:57:28.435 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:32 np0005592767 podman[240966]: 2026-01-22 22:57:32.159977193 +0000 UTC m=+0.075043565 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6)
Jan 22 17:57:32 np0005592767 podman[240965]: 2026-01-22 22:57:32.238728853 +0000 UTC m=+0.159976020 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 17:57:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:32Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:8d:b3 10.100.0.13
Jan 22 17:57:32 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:32Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:8d:b3 10.100.0.13
Jan 22 17:57:33 np0005592767 nova_compute[182623]: 2026-01-22 22:57:33.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:33 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:33.359 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:33 np0005592767 nova_compute[182623]: 2026-01-22 22:57:33.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:35 np0005592767 nova_compute[182623]: 2026-01-22 22:57:35.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:36 np0005592767 nova_compute[182623]: 2026-01-22 22:57:36.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:36 np0005592767 nova_compute[182623]: 2026-01-22 22:57:36.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:57:36 np0005592767 nova_compute[182623]: 2026-01-22 22:57:36.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:57:37 np0005592767 systemd[1]: Starting dnf makecache...
Jan 22 17:57:37 np0005592767 dnf[241011]: Metadata cache refreshed recently.
Jan 22 17:57:37 np0005592767 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 17:57:37 np0005592767 systemd[1]: Finished dnf makecache.
Jan 22 17:57:38 np0005592767 nova_compute[182623]: 2026-01-22 22:57:38.023 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:38 np0005592767 nova_compute[182623]: 2026-01-22 22:57:38.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:39 np0005592767 nova_compute[182623]: 2026-01-22 22:57:39.322 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:57:39 np0005592767 nova_compute[182623]: 2026-01-22 22:57:39.323 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:57:39 np0005592767 nova_compute[182623]: 2026-01-22 22:57:39.324 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:57:39 np0005592767 nova_compute[182623]: 2026-01-22 22:57:39.324 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 625dab51-0d70-4c53-9794-741a3c4ccfc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:57:40 np0005592767 podman[241012]: 2026-01-22 22:57:40.140091003 +0000 UTC m=+0.060639678 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 17:57:40 np0005592767 podman[241013]: 2026-01-22 22:57:40.146026841 +0000 UTC m=+0.057428667 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:57:42 np0005592767 nova_compute[182623]: 2026-01-22 22:57:42.216 182627 DEBUG nova.compute.manager [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-changed-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:42 np0005592767 nova_compute[182623]: 2026-01-22 22:57:42.216 182627 DEBUG nova.compute.manager [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Refreshing instance network info cache due to event network-changed-ad58676d-c707-42ac-95cb-bafdf2aa7e4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:57:42 np0005592767 nova_compute[182623]: 2026-01-22 22:57:42.217 182627 DEBUG oslo_concurrency.lockutils [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:57:43 np0005592767 nova_compute[182623]: 2026-01-22 22:57:43.026 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:43 np0005592767 nova_compute[182623]: 2026-01-22 22:57:43.442 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.320 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.321 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.322 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.323 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.323 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.353 182627 INFO nova.compute.manager [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Terminating instance#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.367 182627 DEBUG nova.compute.manager [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:57:45 np0005592767 kernel: tapad58676d-c7 (unregistering): left promiscuous mode
Jan 22 17:57:45 np0005592767 NetworkManager[54973]: <info>  [1769122665.3913] device (tapad58676d-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:57:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:45Z|00793|binding|INFO|Releasing lport ad58676d-c707-42ac-95cb-bafdf2aa7e4b from this chassis (sb_readonly=0)
Jan 22 17:57:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:45Z|00794|binding|INFO|Setting lport ad58676d-c707-42ac-95cb-bafdf2aa7e4b down in Southbound
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.401 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:45 np0005592767 ovn_controller[94769]: 2026-01-22T22:57:45Z|00795|binding|INFO|Removing iface tapad58676d-c7 ovn-installed in OVS
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.403 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.422 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:45 np0005592767 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Jan 22 17:57:45 np0005592767 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000b7.scope: Consumed 12.967s CPU time.
Jan 22 17:57:45 np0005592767 systemd-machined[153912]: Machine qemu-94-instance-000000b7 terminated.
Jan 22 17:57:45 np0005592767 podman[241053]: 2026-01-22 22:57:45.477157045 +0000 UTC m=+0.058621340 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.635 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.689 182627 INFO nova.virt.libvirt.driver [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Instance destroyed successfully.#033[00m
Jan 22 17:57:45 np0005592767 nova_compute[182623]: 2026-01-22 22:57:45.690 182627 DEBUG nova.objects.instance [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lazy-loading 'resources' on Instance uuid 625dab51-0d70-4c53-9794-741a3c4ccfc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:57:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:46.881 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:8d:b3 10.100.0.13 2001:db8::f816:3eff:fe66:8db3'], port_security=['fa:16:3e:66:8d:b3 10.100.0.13 2001:db8::f816:3eff:fe66:8db3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe66:8db3/64', 'neutron:device_id': '625dab51-0d70-4c53-9794-741a3c4ccfc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61f6867826994602937cf08774d215cf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ed139db-5528-4ba0-9d69-09cd70ed61c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a169643-63c0-4f43-aa55-2402edf1efd7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=ad58676d-c707-42ac-95cb-bafdf2aa7e4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:57:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:46.883 104135 INFO neutron.agent.ovn.metadata.agent [-] Port ad58676d-c707-42ac-95cb-bafdf2aa7e4b in datapath d8d77c31-420b-47d9-87ac-6c37fe7e216d unbound from our chassis#033[00m
Jan 22 17:57:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:46.886 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8d77c31-420b-47d9-87ac-6c37fe7e216d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:57:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:46.887 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[661138b7-d18c-4b16-b210-ed7874c32eb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:46 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:46.888 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d namespace which is not needed anymore#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.914 182627 DEBUG nova.virt.libvirt.vif [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1642431739',display_name='tempest-TestGettingAddress-server-1642431739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1642431739',id=183,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKHZ1KMZo+qWxA9GoBPq33zivQ9decEOfNbceKLZHuHOqsDs/tyA3D4sE3L5JZ1B6KO/xDwF+p7p9iYpYMQYQwF7tVFf0iZITIwPPCAfwIzY9rt9qx275gQpD1U0a/ULPw==',key_name='tempest-TestGettingAddress-1399143294',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:57:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61f6867826994602937cf08774d215cf',ramdisk_id='',reservation_id='r-4xb9s2cl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1431418722',owner_user_name='tempest-TestGettingAddress-1431418722-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:57:19Z,user_data=None,user_id='17723e69e2af4d3d9c5837bae2a0ad5f',uuid=625dab51-0d70-4c53-9794-741a3c4ccfc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.914 182627 DEBUG nova.network.os_vif_util [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converting VIF {"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.915 182627 DEBUG nova.network.os_vif_util [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.916 182627 DEBUG os_vif [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.919 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updating instance_info_cache with network_info: [{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.920 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.921 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad58676d-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.922 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.926 182627 INFO os_vif [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:8d:b3,bridge_name='br-int',has_traffic_filtering=True,id=ad58676d-c707-42ac-95cb-bafdf2aa7e4b,network=Network(d8d77c31-420b-47d9-87ac-6c37fe7e216d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapad58676d-c7')#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.926 182627 INFO nova.virt.libvirt.driver [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Deleting instance files /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0_del#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.927 182627 INFO nova.virt.libvirt.driver [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Deletion of /var/lib/nova/instances/625dab51-0d70-4c53-9794-741a3c4ccfc0_del complete#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.963 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.963 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.964 182627 DEBUG oslo_concurrency.lockutils [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.965 182627 DEBUG nova.network.neutron [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Refreshing network info cache for port ad58676d-c707-42ac-95cb-bafdf2aa7e4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.969 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.971 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.971 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.972 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:46 np0005592767 nova_compute[182623]: 2026-01-22 22:57:46.972 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.014 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.014 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.018 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.019 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.032 182627 INFO nova.compute.manager [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Took 1.66 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.032 182627 DEBUG oslo.service.loopingcall [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.033 182627 DEBUG nova.compute.manager [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.033 182627 DEBUG nova.network.neutron [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:57:47 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [NOTICE]   (240920) : haproxy version is 2.8.14-c23fe91
Jan 22 17:57:47 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [NOTICE]   (240920) : path to executable is /usr/sbin/haproxy
Jan 22 17:57:47 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [WARNING]  (240920) : Exiting Master process...
Jan 22 17:57:47 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [ALERT]    (240920) : Current worker (240922) exited with code 143 (Terminated)
Jan 22 17:57:47 np0005592767 neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d[240916]: [WARNING]  (240920) : All workers exited. Exiting... (0)
Jan 22 17:57:47 np0005592767 systemd[1]: libpod-e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2.scope: Deactivated successfully.
Jan 22 17:57:47 np0005592767 podman[241118]: 2026-01-22 22:57:47.078317996 +0000 UTC m=+0.073720738 container died e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:57:47 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2-userdata-shm.mount: Deactivated successfully.
Jan 22 17:57:47 np0005592767 systemd[1]: var-lib-containers-storage-overlay-7dd0a3c14dc086bd27d5f113689517b9dcaece28d15841cc068c7f8ee16eb003-merged.mount: Deactivated successfully.
Jan 22 17:57:47 np0005592767 podman[241118]: 2026-01-22 22:57:47.114759348 +0000 UTC m=+0.110162070 container cleanup e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:57:47 np0005592767 systemd[1]: libpod-conmon-e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2.scope: Deactivated successfully.
Jan 22 17:57:47 np0005592767 podman[241148]: 2026-01-22 22:57:47.191527972 +0000 UTC m=+0.051136739 container remove e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.196 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a59bad-808b-4756-9262-627f5a59b715]: (4, ('Thu Jan 22 10:57:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d (e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2)\ne85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2\nThu Jan 22 10:57:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d (e85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2)\ne85724a5c80ce744d1f46ff0b9e7e62e8e18652446b8d1c0bad803bfd228d8a2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.200 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b27a966e-a9b0-43b9-aa43-7dd489fa3377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.202 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8d77c31-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.205 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:47 np0005592767 kernel: tapd8d77c31-40: left promiscuous mode
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.216 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.217 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.220 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea87df9-cd79-42f7-bb9b-f1d2c967b210]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.243 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2ac2fc-3a2b-467a-b21d-0f193260e90a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.244 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.245 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5685MB free_disk=73.05112075805664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.245 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:47 np0005592767 nova_compute[182623]: 2026-01-22 22:57:47.245 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.245 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc11265-5082-4807-a975-45d5a97eb982]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.258 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b153a70f-9212-4ff0-8603-9e9ebd4a7054]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623554, 'reachable_time': 19860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241164, 'error': None, 'target': 'ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.260 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8d77c31-420b-47d9-87ac-6c37fe7e216d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:57:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:57:47.261 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcbe8de-9aec-497c-838b-0f54f079cd80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:57:47 np0005592767 systemd[1]: run-netns-ovnmeta\x2dd8d77c31\x2d420b\x2d47d9\x2d87ac\x2d6c37fe7e216d.mount: Deactivated successfully.
Jan 22 17:57:48 np0005592767 nova_compute[182623]: 2026-01-22 22:57:48.445 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.748 182627 DEBUG nova.compute.manager [req-ac09e9b1-2361-4e9d-9cc4-8a2a8ae1e857 req-58e30613-b2f3-4bb4-b995-cd935d0dc992 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-vif-unplugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.748 182627 DEBUG oslo_concurrency.lockutils [req-ac09e9b1-2361-4e9d-9cc4-8a2a8ae1e857 req-58e30613-b2f3-4bb4-b995-cd935d0dc992 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.748 182627 DEBUG oslo_concurrency.lockutils [req-ac09e9b1-2361-4e9d-9cc4-8a2a8ae1e857 req-58e30613-b2f3-4bb4-b995-cd935d0dc992 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.749 182627 DEBUG oslo_concurrency.lockutils [req-ac09e9b1-2361-4e9d-9cc4-8a2a8ae1e857 req-58e30613-b2f3-4bb4-b995-cd935d0dc992 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.749 182627 DEBUG nova.compute.manager [req-ac09e9b1-2361-4e9d-9cc4-8a2a8ae1e857 req-58e30613-b2f3-4bb4-b995-cd935d0dc992 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] No waiting events found dispatching network-vif-unplugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.749 182627 DEBUG nova.compute.manager [req-ac09e9b1-2361-4e9d-9cc4-8a2a8ae1e857 req-58e30613-b2f3-4bb4-b995-cd935d0dc992 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-vif-unplugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.820 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 625dab51-0d70-4c53-9794-741a3c4ccfc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.820 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.820 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.901 182627 DEBUG nova.network.neutron [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.913 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.933 182627 INFO nova.compute.manager [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Took 2.90 seconds to deallocate network for instance.#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.934 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.977 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:57:49 np0005592767 nova_compute[182623]: 2026-01-22 22:57:49.978 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:51 np0005592767 nova_compute[182623]: 2026-01-22 22:57:51.771 182627 DEBUG nova.network.neutron [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updated VIF entry in instance network info cache for port ad58676d-c707-42ac-95cb-bafdf2aa7e4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:57:51 np0005592767 nova_compute[182623]: 2026-01-22 22:57:51.772 182627 DEBUG nova.network.neutron [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Updating instance_info_cache with network_info: [{"id": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "address": "fa:16:3e:66:8d:b3", "network": {"id": "d8d77c31-420b-47d9-87ac-6c37fe7e216d", "bridge": "br-int", "label": "tempest-network-smoke--711854243", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe66:8db3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61f6867826994602937cf08774d215cf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapad58676d-c7", "ovs_interfaceid": "ad58676d-c707-42ac-95cb-bafdf2aa7e4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:57:51 np0005592767 nova_compute[182623]: 2026-01-22 22:57:51.902 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:51 np0005592767 nova_compute[182623]: 2026-01-22 22:57:51.903 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:51 np0005592767 nova_compute[182623]: 2026-01-22 22:57:51.904 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:57:51 np0005592767 nova_compute[182623]: 2026-01-22 22:57:51.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.757 182627 DEBUG nova.compute.manager [req-9b68fe0e-af3d-4118-969c-07743f15a3c0 req-933d6fe6-132c-459b-bc89-1c443afaa415 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-vif-deleted-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.810 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.811 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.848 182627 DEBUG oslo_concurrency.lockutils [req-04752cae-46b0-46b2-b0a9-1631e36de62f req-2c9011bf-24ef-4932-a23f-2d4a6128cf06 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-625dab51-0d70-4c53-9794-741a3c4ccfc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.918 182627 DEBUG nova.compute.provider_tree [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.949 182627 DEBUG nova.scheduler.client.report [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:57:52 np0005592767 nova_compute[182623]: 2026-01-22 22:57:52.978 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.011 182627 DEBUG nova.compute.manager [req-b96b28fc-f7eb-484c-bdf2-68d1c16be5be req-3572f21f-fdde-4ed8-b082-13c4d54260d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.012 182627 DEBUG oslo_concurrency.lockutils [req-b96b28fc-f7eb-484c-bdf2-68d1c16be5be req-3572f21f-fdde-4ed8-b082-13c4d54260d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.012 182627 DEBUG oslo_concurrency.lockutils [req-b96b28fc-f7eb-484c-bdf2-68d1c16be5be req-3572f21f-fdde-4ed8-b082-13c4d54260d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.013 182627 DEBUG oslo_concurrency.lockutils [req-b96b28fc-f7eb-484c-bdf2-68d1c16be5be req-3572f21f-fdde-4ed8-b082-13c4d54260d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.013 182627 DEBUG nova.compute.manager [req-b96b28fc-f7eb-484c-bdf2-68d1c16be5be req-3572f21f-fdde-4ed8-b082-13c4d54260d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] No waiting events found dispatching network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.014 182627 WARNING nova.compute.manager [req-b96b28fc-f7eb-484c-bdf2-68d1c16be5be req-3572f21f-fdde-4ed8-b082-13c4d54260d7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Received unexpected event network-vif-plugged-ad58676d-c707-42ac-95cb-bafdf2aa7e4b for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.036 182627 INFO nova.scheduler.client.report [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Deleted allocations for instance 625dab51-0d70-4c53-9794-741a3c4ccfc0#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.122 182627 DEBUG oslo_concurrency.lockutils [None req-c0a7755d-891a-4aa9-afed-9c2abe0f5c1d 17723e69e2af4d3d9c5837bae2a0ad5f 61f6867826994602937cf08774d215cf - - default default] Lock "625dab51-0d70-4c53-9794-741a3c4ccfc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.212 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:53 np0005592767 nova_compute[182623]: 2026-01-22 22:57:53.499 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:56 np0005592767 nova_compute[182623]: 2026-01-22 22:57:56.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:58 np0005592767 podman[241167]: 2026-01-22 22:57:58.14244268 +0000 UTC m=+0.058899039 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 17:57:58 np0005592767 nova_compute[182623]: 2026-01-22 22:57:58.501 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:57:58 np0005592767 nova_compute[182623]: 2026-01-22 22:57:58.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:57:58 np0005592767 nova_compute[182623]: 2026-01-22 22:57:58.923 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:00 np0005592767 nova_compute[182623]: 2026-01-22 22:58:00.686 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122665.6853914, 625dab51-0d70-4c53-9794-741a3c4ccfc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:58:00 np0005592767 nova_compute[182623]: 2026-01-22 22:58:00.687 182627 INFO nova.compute.manager [-] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:58:00 np0005592767 nova_compute[182623]: 2026-01-22 22:58:00.728 182627 DEBUG nova.compute.manager [None req-bddd1934-f85c-4d46-854c-c2505491db62 - - - - - -] [instance: 625dab51-0d70-4c53-9794-741a3c4ccfc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:58:01 np0005592767 nova_compute[182623]: 2026-01-22 22:58:01.928 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:03 np0005592767 podman[241189]: 2026-01-22 22:58:03.161449237 +0000 UTC m=+0.072729470 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vendor=Red Hat, Inc.)
Jan 22 17:58:03 np0005592767 podman[241188]: 2026-01-22 22:58:03.183282806 +0000 UTC m=+0.107843905 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:58:03 np0005592767 nova_compute[182623]: 2026-01-22 22:58:03.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:06 np0005592767 nova_compute[182623]: 2026-01-22 22:58:06.929 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 22:58:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 17:58:08 np0005592767 nova_compute[182623]: 2026-01-22 22:58:08.506 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:08 np0005592767 nova_compute[182623]: 2026-01-22 22:58:08.958 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.033 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.034 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.054 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.099 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.180 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.180 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.188 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.188 182627 INFO nova.compute.claims [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Claim successful on node compute-2.ctlplane.example.com
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.507 182627 DEBUG nova.compute.provider_tree [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.522 182627 DEBUG nova.scheduler.client.report [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.561 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.562 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.672 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.672 182627 DEBUG nova.network.neutron [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.700 182627 INFO nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.723 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.880 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.881 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.882 182627 INFO nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Creating image(s)
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.882 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "/var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.882 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "/var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.883 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "/var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.899 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.959 182627 DEBUG nova.policy [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2dde2bf09b047c99729d9bb2a52f210', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '950f29a9a7a847c59baa1fbd5f79a146', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.963 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.964 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.964 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:09 np0005592767 nova_compute[182623]: 2026-01-22 22:58:09.978 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.030 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.031 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.065 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.066 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.067 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.128 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.129 182627 DEBUG nova.virt.disk.api [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Checking if we can resize image /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.130 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.200 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.201 182627 DEBUG nova.virt.disk.api [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Cannot resize image /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.202 182627 DEBUG nova.objects.instance [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lazy-loading 'migration_context' on Instance uuid ba4a2a05-d133-44b5-9f11-61443b261a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.227 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.228 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Ensure instance console log exists: /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.229 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.229 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 17:58:10 np0005592767 nova_compute[182623]: 2026-01-22 22:58:10.229 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 17:58:11 np0005592767 nova_compute[182623]: 2026-01-22 22:58:11.051 182627 DEBUG nova.network.neutron [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Successfully created port: cd57b878-8083-4927-971e-fc15d5c075a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 17:58:11 np0005592767 podman[241251]: 2026-01-22 22:58:11.130992499 +0000 UTC m=+0.047732082 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:58:11 np0005592767 podman[241252]: 2026-01-22 22:58:11.140148758 +0000 UTC m=+0.050792469 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 17:58:11 np0005592767 nova_compute[182623]: 2026-01-22 22:58:11.932 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:12.129 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:12.129 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:12.129 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.289 182627 DEBUG nova.network.neutron [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Successfully updated port: cd57b878-8083-4927-971e-fc15d5c075a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.311 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.312 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquired lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.312 182627 DEBUG nova.network.neutron [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.518 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.575 182627 DEBUG nova.compute.manager [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-changed-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.575 182627 DEBUG nova.compute.manager [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Refreshing instance network info cache due to event network-changed-cd57b878-8083-4927-971e-fc15d5c075a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:58:13 np0005592767 nova_compute[182623]: 2026-01-22 22:58:13.576 182627 DEBUG oslo_concurrency.lockutils [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:58:14 np0005592767 nova_compute[182623]: 2026-01-22 22:58:14.384 182627 DEBUG nova.network.neutron [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:58:15 np0005592767 podman[241295]: 2026-01-22 22:58:15.85174228 +0000 UTC m=+0.045842218 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 17:58:16 np0005592767 nova_compute[182623]: 2026-01-22 22:58:16.934 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.571 182627 DEBUG nova.network.neutron [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updating instance_info_cache with network_info: [{"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.634 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Releasing lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.635 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Instance network_info: |[{"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.635 182627 DEBUG oslo_concurrency.lockutils [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.635 182627 DEBUG nova.network.neutron [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Refreshing network info cache for port cd57b878-8083-4927-971e-fc15d5c075a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.637 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Start _get_guest_xml network_info=[{"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.642 182627 WARNING nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.647 182627 DEBUG nova.virt.libvirt.host [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.648 182627 DEBUG nova.virt.libvirt.host [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.651 182627 DEBUG nova.virt.libvirt.host [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.651 182627 DEBUG nova.virt.libvirt.host [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.653 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.653 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.653 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.653 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.654 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.654 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.654 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.654 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.654 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.655 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.655 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.655 182627 DEBUG nova.virt.hardware [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.658 182627 DEBUG nova.virt.libvirt.vif [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:58:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1757678018',display_name='tempest-TestServerBasicOps-server-1757678018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1757678018',id=184,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF8B8xeqnK4SME7hEbNfDqioKZ1v/O8y8X2Ow1+GT31owh5NHteR9gYrBXMkRBrDHqwIDshBEtKIaXVXzmpIZVxMhLJqQcbm5XgG2nPWpAPdbNk86Ii3btVIKf/JaF76xg==',key_name='tempest-TestServerBasicOps-758584346',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='950f29a9a7a847c59baa1fbd5f79a146',ramdisk_id='',reservation_id='r-ruopa8i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1651057148',owner_user_name='tempest-TestServerBasicOps-1651057148-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:58:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f2dde2bf09b047c99729d9bb2a52f210',uuid=ba4a2a05-d133-44b5-9f11-61443b261a25,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.659 182627 DEBUG nova.network.os_vif_util [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Converting VIF {"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.659 182627 DEBUG nova.network.os_vif_util [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.660 182627 DEBUG nova.objects.instance [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lazy-loading 'pci_devices' on Instance uuid ba4a2a05-d133-44b5-9f11-61443b261a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.673 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <uuid>ba4a2a05-d133-44b5-9f11-61443b261a25</uuid>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <name>instance-000000b8</name>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestServerBasicOps-server-1757678018</nova:name>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:58:17</nova:creationTime>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:user uuid="f2dde2bf09b047c99729d9bb2a52f210">tempest-TestServerBasicOps-1651057148-project-member</nova:user>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:project uuid="950f29a9a7a847c59baa1fbd5f79a146">tempest-TestServerBasicOps-1651057148</nova:project>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        <nova:port uuid="cd57b878-8083-4927-971e-fc15d5c075a4">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <entry name="serial">ba4a2a05-d133-44b5-9f11-61443b261a25</entry>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <entry name="uuid">ba4a2a05-d133-44b5-9f11-61443b261a25</entry>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.config"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:0b:b8:2e"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <target dev="tapcd57b878-80"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/console.log" append="off"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:58:17 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:58:17 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:58:17 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:58:17 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.675 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Preparing to wait for external event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.675 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.676 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.676 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.676 182627 DEBUG nova.virt.libvirt.vif [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:58:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1757678018',display_name='tempest-TestServerBasicOps-server-1757678018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1757678018',id=184,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF8B8xeqnK4SME7hEbNfDqioKZ1v/O8y8X2Ow1+GT31owh5NHteR9gYrBXMkRBrDHqwIDshBEtKIaXVXzmpIZVxMhLJqQcbm5XgG2nPWpAPdbNk86Ii3btVIKf/JaF76xg==',key_name='tempest-TestServerBasicOps-758584346',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='950f29a9a7a847c59baa1fbd5f79a146',ramdisk_id='',reservation_id='r-ruopa8i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1651057148',owner_user_name='tempest-TestServerBasicOps-1651057148-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:58:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f2dde2bf09b047c99729d9bb2a52f210',uuid=ba4a2a05-d133-44b5-9f11-61443b261a25,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.677 182627 DEBUG nova.network.os_vif_util [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Converting VIF {"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.677 182627 DEBUG nova.network.os_vif_util [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.677 182627 DEBUG os_vif [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.678 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.678 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.679 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.681 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.681 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd57b878-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.682 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd57b878-80, col_values=(('external_ids', {'iface-id': 'cd57b878-8083-4927-971e-fc15d5c075a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:b8:2e', 'vm-uuid': 'ba4a2a05-d133-44b5-9f11-61443b261a25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:17 np0005592767 NetworkManager[54973]: <info>  [1769122697.6846] manager: (tapcd57b878-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.688 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.692 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.693 182627 INFO os_vif [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80')#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.752 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.752 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.752 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] No VIF found with MAC fa:16:3e:0b:b8:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:58:17 np0005592767 nova_compute[182623]: 2026-01-22 22:58:17.753 182627 INFO nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Using config drive#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.229 182627 INFO nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Creating config drive at /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.config#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.235 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9w2w1qet execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.365 182627 DEBUG oslo_concurrency.processutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9w2w1qet" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:58:18 np0005592767 kernel: tapcd57b878-80: entered promiscuous mode
Jan 22 17:58:18 np0005592767 NetworkManager[54973]: <info>  [1769122698.4246] manager: (tapcd57b878-80): new Tun device (/org/freedesktop/NetworkManager/Devices/381)
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.425 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:18Z|00796|binding|INFO|Claiming lport cd57b878-8083-4927-971e-fc15d5c075a4 for this chassis.
Jan 22 17:58:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:18Z|00797|binding|INFO|cd57b878-8083-4927-971e-fc15d5c075a4: Claiming fa:16:3e:0b:b8:2e 10.100.0.3
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.430 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.437 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:b8:2e 10.100.0.3'], port_security=['fa:16:3e:0b:b8:2e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ba4a2a05-d133-44b5-9f11-61443b261a25', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-483bff69-5ba3-437f-8350-a3a773250d59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '950f29a9a7a847c59baa1fbd5f79a146', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3f1d0353-d38d-44d3-8fc8-97893d3fcb08 cd49c4a2-595b-4f39-962c-8460837545ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=416def01-18e7-47cf-a995-dc83db0261a9, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=cd57b878-8083-4927-971e-fc15d5c075a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.437 104135 INFO neutron.agent.ovn.metadata.agent [-] Port cd57b878-8083-4927-971e-fc15d5c075a4 in datapath 483bff69-5ba3-437f-8350-a3a773250d59 bound to our chassis#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.439 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 483bff69-5ba3-437f-8350-a3a773250d59#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.449 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0cae7a59-c35b-49c9-b317-c8ae3abe22db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.450 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap483bff69-51 in ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.452 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap483bff69-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.452 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d070fc-4b4b-4a2a-aa47-f14061f4e394]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.453 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e5323e33-005c-4346-8618-f69a59b2726f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 systemd-udevd[241339]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:58:18 np0005592767 systemd-machined[153912]: New machine qemu-95-instance-000000b8.
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.463 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7593a3-1180-4a84-bb8d-1f6a193d619c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 NetworkManager[54973]: <info>  [1769122698.4664] device (tapcd57b878-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:58:18 np0005592767 NetworkManager[54973]: <info>  [1769122698.4671] device (tapcd57b878-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.478 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf6476b-747f-4997-a224-d60abaec5ec4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.479 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 systemd[1]: Started Virtual Machine qemu-95-instance-000000b8.
Jan 22 17:58:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:18Z|00798|binding|INFO|Setting lport cd57b878-8083-4927-971e-fc15d5c075a4 ovn-installed in OVS
Jan 22 17:58:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:18Z|00799|binding|INFO|Setting lport cd57b878-8083-4927-971e-fc15d5c075a4 up in Southbound
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.485 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.505 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0c38ed-1f83-42e7-a245-c6c2bf213296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 NetworkManager[54973]: <info>  [1769122698.5098] manager: (tap483bff69-50): new Veth device (/org/freedesktop/NetworkManager/Devices/382)
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.509 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[4de16452-0624-4439-ac60-9d40459832b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.519 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.545 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[c364ab0d-51c0-463e-9e62-abdacb2bd4e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.548 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[6468fb2a-4ee7-4c54-a8d1-66b57ae9c6bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 NetworkManager[54973]: <info>  [1769122698.5707] device (tap483bff69-50): carrier: link connected
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.577 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[4a553f97-8033-4022-927a-f33f9b24f9fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.592 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[60fdc565-87b1-48dd-ad8a-c6cb1ade29d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap483bff69-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:35:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629514, 'reachable_time': 30836, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241372, 'error': None, 'target': 'ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.615 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1fa543-cfa2-4b36-9e22-ca1eface2ef1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:354c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 629514, 'tstamp': 629514}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241373, 'error': None, 'target': 'ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.634 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d1ae33-3e20-4b57-b949-c82935b3cbe7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap483bff69-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:35:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 242], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629514, 'reachable_time': 30836, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241374, 'error': None, 'target': 'ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.670 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18fb5868-6918-486d-b852-d5045955c6f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.750 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[552eedb4-4be0-4b93-bacf-bb008d45a032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.755 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap483bff69-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.755 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.756 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap483bff69-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.758 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 NetworkManager[54973]: <info>  [1769122698.7593] manager: (tap483bff69-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/383)
Jan 22 17:58:18 np0005592767 kernel: tap483bff69-50: entered promiscuous mode
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.762 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.763 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap483bff69-50, col_values=(('external_ids', {'iface-id': '74192358-c218-4e58-a539-e3597faa6ed0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.765 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.767 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.767 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/483bff69-5ba3-437f-8350-a3a773250d59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/483bff69-5ba3-437f-8350-a3a773250d59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:58:18 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:18Z|00800|binding|INFO|Releasing lport 74192358-c218-4e58-a539-e3597faa6ed0 from this chassis (sb_readonly=0)
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.769 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[89c57908-dd91-45ac-9eeb-8a5addc1dcd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.770 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-483bff69-5ba3-437f-8350-a3a773250d59
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/483bff69-5ba3-437f-8350-a3a773250d59.pid.haproxy
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 483bff69-5ba3-437f-8350-a3a773250d59
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:58:18 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:18.771 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59', 'env', 'PROCESS_TAG=haproxy-483bff69-5ba3-437f-8350-a3a773250d59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/483bff69-5ba3-437f-8350-a3a773250d59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.781 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.902 182627 DEBUG nova.compute.manager [req-668ef562-fa17-4193-9fd9-789bb88a7203 req-06820999-61a0-43a8-9fa4-9f89a15112fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.903 182627 DEBUG oslo_concurrency.lockutils [req-668ef562-fa17-4193-9fd9-789bb88a7203 req-06820999-61a0-43a8-9fa4-9f89a15112fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.904 182627 DEBUG oslo_concurrency.lockutils [req-668ef562-fa17-4193-9fd9-789bb88a7203 req-06820999-61a0-43a8-9fa4-9f89a15112fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.904 182627 DEBUG oslo_concurrency.lockutils [req-668ef562-fa17-4193-9fd9-789bb88a7203 req-06820999-61a0-43a8-9fa4-9f89a15112fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:18 np0005592767 nova_compute[182623]: 2026-01-22 22:58:18.904 182627 DEBUG nova.compute.manager [req-668ef562-fa17-4193-9fd9-789bb88a7203 req-06820999-61a0-43a8-9fa4-9f89a15112fc 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Processing event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:58:19 np0005592767 podman[241406]: 2026-01-22 22:58:19.134857881 +0000 UTC m=+0.052481127 container create 72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:58:19 np0005592767 systemd[1]: Started libpod-conmon-72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60.scope.
Jan 22 17:58:19 np0005592767 podman[241406]: 2026-01-22 22:58:19.104537982 +0000 UTC m=+0.022161228 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:58:19 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:58:19 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/294a995ab609a8f973ed2fa267f337057462eda9ecfb8bd4e807a0bb5e3712d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:58:19 np0005592767 podman[241406]: 2026-01-22 22:58:19.228173842 +0000 UTC m=+0.145797138 container init 72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:58:19 np0005592767 podman[241406]: 2026-01-22 22:58:19.238870835 +0000 UTC m=+0.156494091 container start 72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 17:58:19 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [NOTICE]   (241425) : New worker (241427) forked
Jan 22 17:58:19 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [NOTICE]   (241425) : Loading success.
Jan 22 17:58:19 np0005592767 nova_compute[182623]: 2026-01-22 22:58:19.723 182627 DEBUG nova.network.neutron [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updated VIF entry in instance network info cache for port cd57b878-8083-4927-971e-fc15d5c075a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:58:19 np0005592767 nova_compute[182623]: 2026-01-22 22:58:19.724 182627 DEBUG nova.network.neutron [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updating instance_info_cache with network_info: [{"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:58:19 np0005592767 nova_compute[182623]: 2026-01-22 22:58:19.750 182627 DEBUG oslo_concurrency.lockutils [req-2c4c9ab9-20eb-4ec0-907e-1a94895a76d3 req-402d6549-bf21-4a24-b8ea-0d23b727daab 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.018 182627 DEBUG nova.compute.manager [req-e31af80b-bc93-4210-b431-d478b679b169 req-b53ce8ca-3bed-4172-ae3a-904adbd60851 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.019 182627 DEBUG oslo_concurrency.lockutils [req-e31af80b-bc93-4210-b431-d478b679b169 req-b53ce8ca-3bed-4172-ae3a-904adbd60851 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.020 182627 DEBUG oslo_concurrency.lockutils [req-e31af80b-bc93-4210-b431-d478b679b169 req-b53ce8ca-3bed-4172-ae3a-904adbd60851 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.020 182627 DEBUG oslo_concurrency.lockutils [req-e31af80b-bc93-4210-b431-d478b679b169 req-b53ce8ca-3bed-4172-ae3a-904adbd60851 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.020 182627 DEBUG nova.compute.manager [req-e31af80b-bc93-4210-b431-d478b679b169 req-b53ce8ca-3bed-4172-ae3a-904adbd60851 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] No waiting events found dispatching network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.021 182627 WARNING nova.compute.manager [req-e31af80b-bc93-4210-b431-d478b679b169 req-b53ce8ca-3bed-4172-ae3a-904adbd60851 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received unexpected event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 for instance with vm_state building and task_state spawning.#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.258 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.259 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122701.2584891, ba4a2a05-d133-44b5-9f11-61443b261a25 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.260 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] VM Started (Lifecycle Event)#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.267 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.272 182627 INFO nova.virt.libvirt.driver [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Instance spawned successfully.#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.273 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.287 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.293 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.296 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.297 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.297 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.298 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.298 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.298 182627 DEBUG nova.virt.libvirt.driver [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.326 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.326 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122701.2585948, ba4a2a05-d133-44b5-9f11-61443b261a25 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.327 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.348 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.351 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122701.2624395, ba4a2a05-d133-44b5-9f11-61443b261a25 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.351 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.378 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.379 182627 INFO nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Took 11.50 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.379 182627 DEBUG nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.382 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.419 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.476 182627 INFO nova.compute.manager [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Took 12.35 seconds to build instance.#033[00m
Jan 22 17:58:21 np0005592767 nova_compute[182623]: 2026-01-22 22:58:21.501 182627 DEBUG oslo_concurrency.lockutils [None req-0810e9b3-4e7c-4c37-900e-42572fd61cb8 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:22 np0005592767 nova_compute[182623]: 2026-01-22 22:58:22.685 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:23 np0005592767 nova_compute[182623]: 2026-01-22 22:58:23.521 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:24 np0005592767 NetworkManager[54973]: <info>  [1769122704.9419] manager: (patch-br-int-to-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Jan 22 17:58:24 np0005592767 nova_compute[182623]: 2026-01-22 22:58:24.939 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:24 np0005592767 NetworkManager[54973]: <info>  [1769122704.9429] manager: (patch-provnet-b0b866bd-536a-4168-ad94-2c51b772a5c2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Jan 22 17:58:25 np0005592767 nova_compute[182623]: 2026-01-22 22:58:25.005 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:25 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:25Z|00801|binding|INFO|Releasing lport 74192358-c218-4e58-a539-e3597faa6ed0 from this chassis (sb_readonly=0)
Jan 22 17:58:25 np0005592767 nova_compute[182623]: 2026-01-22 22:58:25.018 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:26 np0005592767 nova_compute[182623]: 2026-01-22 22:58:26.190 182627 DEBUG nova.compute.manager [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-changed-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:26 np0005592767 nova_compute[182623]: 2026-01-22 22:58:26.191 182627 DEBUG nova.compute.manager [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Refreshing instance network info cache due to event network-changed-cd57b878-8083-4927-971e-fc15d5c075a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:58:26 np0005592767 nova_compute[182623]: 2026-01-22 22:58:26.191 182627 DEBUG oslo_concurrency.lockutils [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:58:26 np0005592767 nova_compute[182623]: 2026-01-22 22:58:26.192 182627 DEBUG oslo_concurrency.lockutils [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:58:26 np0005592767 nova_compute[182623]: 2026-01-22 22:58:26.192 182627 DEBUG nova.network.neutron [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Refreshing network info cache for port cd57b878-8083-4927-971e-fc15d5c075a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:58:27 np0005592767 nova_compute[182623]: 2026-01-22 22:58:27.726 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:28 np0005592767 nova_compute[182623]: 2026-01-22 22:58:28.524 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:28 np0005592767 podman[241445]: 2026-01-22 22:58:28.893503145 +0000 UTC m=+0.064133077 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:58:29 np0005592767 nova_compute[182623]: 2026-01-22 22:58:29.415 182627 DEBUG nova.network.neutron [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updated VIF entry in instance network info cache for port cd57b878-8083-4927-971e-fc15d5c075a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:58:29 np0005592767 nova_compute[182623]: 2026-01-22 22:58:29.416 182627 DEBUG nova.network.neutron [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updating instance_info_cache with network_info: [{"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:58:29 np0005592767 nova_compute[182623]: 2026-01-22 22:58:29.446 182627 DEBUG oslo_concurrency.lockutils [req-3eca2abc-25fb-4cc5-8e1a-1b0d41064ea6 req-8e4d2e2c-b9fa-4bd8-bf7b-44951cb3d2a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:58:32 np0005592767 nova_compute[182623]: 2026-01-22 22:58:32.728 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:33 np0005592767 nova_compute[182623]: 2026-01-22 22:58:33.541 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:34 np0005592767 podman[241483]: 2026-01-22 22:58:34.155161742 +0000 UTC m=+0.063216921 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6)
Jan 22 17:58:34 np0005592767 podman[241482]: 2026-01-22 22:58:34.165458824 +0000 UTC m=+0.083801034 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:58:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:34Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:b8:2e 10.100.0.3
Jan 22 17:58:34 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:34Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:b8:2e 10.100.0.3
Jan 22 17:58:37 np0005592767 nova_compute[182623]: 2026-01-22 22:58:37.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:37 np0005592767 nova_compute[182623]: 2026-01-22 22:58:37.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:38 np0005592767 nova_compute[182623]: 2026-01-22 22:58:38.543 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:38 np0005592767 nova_compute[182623]: 2026-01-22 22:58:38.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:38 np0005592767 nova_compute[182623]: 2026-01-22 22:58:38.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:58:38 np0005592767 nova_compute[182623]: 2026-01-22 22:58:38.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:58:39 np0005592767 nova_compute[182623]: 2026-01-22 22:58:39.379 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:58:39 np0005592767 nova_compute[182623]: 2026-01-22 22:58:39.379 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquired lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:58:39 np0005592767 nova_compute[182623]: 2026-01-22 22:58:39.379 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 22 17:58:39 np0005592767 nova_compute[182623]: 2026-01-22 22:58:39.380 182627 DEBUG nova.objects.instance [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ba4a2a05-d133-44b5-9f11-61443b261a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:58:40 np0005592767 nova_compute[182623]: 2026-01-22 22:58:40.929 182627 DEBUG nova.network.neutron [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updating instance_info_cache with network_info: [{"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:58:40 np0005592767 nova_compute[182623]: 2026-01-22 22:58:40.948 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Releasing lock "refresh_cache-ba4a2a05-d133-44b5-9f11-61443b261a25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:58:40 np0005592767 nova_compute[182623]: 2026-01-22 22:58:40.949 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 22 17:58:41 np0005592767 nova_compute[182623]: 2026-01-22 22:58:41.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:41 np0005592767 nova_compute[182623]: 2026-01-22 22:58:41.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:41 np0005592767 nova_compute[182623]: 2026-01-22 22:58:41.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:41 np0005592767 nova_compute[182623]: 2026-01-22 22:58:41.927 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:41 np0005592767 nova_compute[182623]: 2026-01-22 22:58:41.927 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:41 np0005592767 nova_compute[182623]: 2026-01-22 22:58:41.928 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.014 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:58:42 np0005592767 podman[241532]: 2026-01-22 22:58:42.035309113 +0000 UTC m=+0.056441639 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 17:58:42 np0005592767 podman[241531]: 2026-01-22 22:58:42.0383899 +0000 UTC m=+0.063959502 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.074 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.075 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.127 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.298 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.299 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5546MB free_disk=73.02222442626953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.299 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.300 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.535 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance ba4a2a05-d133-44b5-9f11-61443b261a25 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.535 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.535 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.571 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.586 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.607 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.607 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:42 np0005592767 nova_compute[182623]: 2026-01-22 22:58:42.735 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:43 np0005592767 nova_compute[182623]: 2026-01-22 22:58:43.547 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:43.776 104394 DEBUG eventlet.wsgi.server [-] (104394) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:43.778 104394 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: Accept: */*#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: Connection: close#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: Content-Type: text/plain#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: Host: 169.254.169.254#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: User-Agent: curl/7.84.0#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: X-Forwarded-For: 10.100.0.3#015
Jan 22 17:58:43 np0005592767 ovn_metadata_agent[104130]: X-Ovn-Network-Id: 483bff69-5ba3-437f-8350-a3a773250d59 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 22 17:58:44 np0005592767 nova_compute[182623]: 2026-01-22 22:58:44.607 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:45.565 104394 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:45.565 104394 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.7871585#033[00m
Jan 22 17:58:45 np0005592767 haproxy-metadata-proxy-483bff69-5ba3-437f-8350-a3a773250d59[241427]: 10.100.0.3:57866 [22/Jan/2026:22:58:43.775] listener listener/metadata 0/0/0/1789/1789 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:45.633 104394 DEBUG eventlet.wsgi.server [-] (104394) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:45.634 104394 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: Accept: */*#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: Connection: close#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: Content-Length: 100#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: Content-Type: application/x-www-form-urlencoded#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: Host: 169.254.169.254#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: User-Agent: curl/7.84.0#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: X-Forwarded-For: 10.100.0.3#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: X-Ovn-Network-Id: 483bff69-5ba3-437f-8350-a3a773250d59#015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: #015
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:45.847 104394 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 22 17:58:45 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:45.848 104394 INFO eventlet.wsgi.server [-] 10.100.0.3,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.2136364#033[00m
Jan 22 17:58:45 np0005592767 haproxy-metadata-proxy-483bff69-5ba3-437f-8350-a3a773250d59[241427]: 10.100.0.3:36084 [22/Jan/2026:22:58:45.633] listener listener/metadata 0/0/0/215/215 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Jan 22 17:58:45 np0005592767 nova_compute[182623]: 2026-01-22 22:58:45.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:46 np0005592767 podman[241578]: 2026-01-22 22:58:46.167130152 +0000 UTC m=+0.083678670 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.709 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.709 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.709 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.710 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.710 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.721 182627 INFO nova.compute.manager [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Terminating instance#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.732 182627 DEBUG nova.compute.manager [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.738 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:47 np0005592767 kernel: tapcd57b878-80 (unregistering): left promiscuous mode
Jan 22 17:58:47 np0005592767 NetworkManager[54973]: <info>  [1769122727.7524] device (tapcd57b878-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.758 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:47Z|00802|binding|INFO|Releasing lport cd57b878-8083-4927-971e-fc15d5c075a4 from this chassis (sb_readonly=0)
Jan 22 17:58:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:47Z|00803|binding|INFO|Setting lport cd57b878-8083-4927-971e-fc15d5c075a4 down in Southbound
Jan 22 17:58:47 np0005592767 ovn_controller[94769]: 2026-01-22T22:58:47Z|00804|binding|INFO|Removing iface tapcd57b878-80 ovn-installed in OVS
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.760 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:47.770 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:b8:2e 10.100.0.3'], port_security=['fa:16:3e:0b:b8:2e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ba4a2a05-d133-44b5-9f11-61443b261a25', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-483bff69-5ba3-437f-8350-a3a773250d59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '950f29a9a7a847c59baa1fbd5f79a146', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3f1d0353-d38d-44d3-8fc8-97893d3fcb08 cd49c4a2-595b-4f39-962c-8460837545ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=416def01-18e7-47cf-a995-dc83db0261a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=cd57b878-8083-4927-971e-fc15d5c075a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:58:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:47.772 104135 INFO neutron.agent.ovn.metadata.agent [-] Port cd57b878-8083-4927-971e-fc15d5c075a4 in datapath 483bff69-5ba3-437f-8350-a3a773250d59 unbound from our chassis#033[00m
Jan 22 17:58:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:47.773 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 483bff69-5ba3-437f-8350-a3a773250d59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:58:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:47.774 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9c8b8247-3008-4b21-a8a0-f573832b2f7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:47.775 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59 namespace which is not needed anymore#033[00m
Jan 22 17:58:47 np0005592767 nova_compute[182623]: 2026-01-22 22:58:47.775 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:47 np0005592767 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Jan 22 17:58:47 np0005592767 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000b8.scope: Consumed 15.668s CPU time.
Jan 22 17:58:47 np0005592767 systemd-machined[153912]: Machine qemu-95-instance-000000b8 terminated.
Jan 22 17:58:47 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [NOTICE]   (241425) : haproxy version is 2.8.14-c23fe91
Jan 22 17:58:47 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [NOTICE]   (241425) : path to executable is /usr/sbin/haproxy
Jan 22 17:58:47 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [WARNING]  (241425) : Exiting Master process...
Jan 22 17:58:47 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [ALERT]    (241425) : Current worker (241427) exited with code 143 (Terminated)
Jan 22 17:58:47 np0005592767 neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59[241421]: [WARNING]  (241425) : All workers exited. Exiting... (0)
Jan 22 17:58:47 np0005592767 systemd[1]: libpod-72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60.scope: Deactivated successfully.
Jan 22 17:58:47 np0005592767 podman[241626]: 2026-01-22 22:58:47.911356034 +0000 UTC m=+0.047144126 container died 72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 17:58:47 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60-userdata-shm.mount: Deactivated successfully.
Jan 22 17:58:47 np0005592767 systemd[1]: var-lib-containers-storage-overlay-294a995ab609a8f973ed2fa267f337057462eda9ecfb8bd4e807a0bb5e3712d4-merged.mount: Deactivated successfully.
Jan 22 17:58:48 np0005592767 podman[241626]: 2026-01-22 22:58:47.966547347 +0000 UTC m=+0.102335459 container cleanup 72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 systemd[1]: libpod-conmon-72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60.scope: Deactivated successfully.
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.042 182627 DEBUG nova.compute.manager [req-ca32a92c-8b64-4053-aa17-eec18fae4901 req-7476ecc5-09d7-46c2-a9cd-6d6d74c2af53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-vif-unplugged-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.043 182627 DEBUG oslo_concurrency.lockutils [req-ca32a92c-8b64-4053-aa17-eec18fae4901 req-7476ecc5-09d7-46c2-a9cd-6d6d74c2af53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.043 182627 DEBUG oslo_concurrency.lockutils [req-ca32a92c-8b64-4053-aa17-eec18fae4901 req-7476ecc5-09d7-46c2-a9cd-6d6d74c2af53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.043 182627 DEBUG oslo_concurrency.lockutils [req-ca32a92c-8b64-4053-aa17-eec18fae4901 req-7476ecc5-09d7-46c2-a9cd-6d6d74c2af53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.044 182627 DEBUG nova.compute.manager [req-ca32a92c-8b64-4053-aa17-eec18fae4901 req-7476ecc5-09d7-46c2-a9cd-6d6d74c2af53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] No waiting events found dispatching network-vif-unplugged-cd57b878-8083-4927-971e-fc15d5c075a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.044 182627 DEBUG nova.compute.manager [req-ca32a92c-8b64-4053-aa17-eec18fae4901 req-7476ecc5-09d7-46c2-a9cd-6d6d74c2af53 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-vif-unplugged-cd57b878-8083-4927-971e-fc15d5c075a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 22 17:58:48 np0005592767 podman[241659]: 2026-01-22 22:58:48.050811092 +0000 UTC m=+0.056208092 container remove 72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.057 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f327eef2-a383-41d6-b855-0b3920531dee]: (4, ('Thu Jan 22 10:58:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59 (72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60)\n72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60\nThu Jan 22 10:58:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59 (72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60)\n72a48e037b0baeab1676d2975a8f452fafa5f9fb28b7cfb6b7d4b726a46bde60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.059 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[9118d42a-b1e4-485c-8a0c-973e98e47b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.061 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap483bff69-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.063 182627 INFO nova.virt.libvirt.driver [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Instance destroyed successfully.#033[00m
Jan 22 17:58:48 np0005592767 kernel: tap483bff69-50: left promiscuous mode
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.065 182627 DEBUG nova.objects.instance [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lazy-loading 'resources' on Instance uuid ba4a2a05-d133-44b5-9f11-61443b261a25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.079 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.082 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[18cb474e-d6f8-4ea3-811e-96e1ceb4ee28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.095 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[f91e8373-a164-49f7-a4c0-5fb136d88d58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.097 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[ba342758-6307-49c2-85b7-c2f6a4b01ba4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.097 182627 DEBUG nova.virt.libvirt.vif [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:58:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1757678018',display_name='tempest-TestServerBasicOps-server-1757678018',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1757678018',id=184,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF8B8xeqnK4SME7hEbNfDqioKZ1v/O8y8X2Ow1+GT31owh5NHteR9gYrBXMkRBrDHqwIDshBEtKIaXVXzmpIZVxMhLJqQcbm5XgG2nPWpAPdbNk86Ii3btVIKf/JaF76xg==',key_name='tempest-TestServerBasicOps-758584346',keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:58:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='950f29a9a7a847c59baa1fbd5f79a146',ramdisk_id='',reservation_id='r-ruopa8i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1651057148',owner_user_name='tempest-TestServerBasicOps-1651057148-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:58:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f2dde2bf09b047c99729d9bb2a52f210',uuid=ba4a2a05-d133-44b5-9f11-61443b261a25,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": 
"fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.098 182627 DEBUG nova.network.os_vif_util [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Converting VIF {"id": "cd57b878-8083-4927-971e-fc15d5c075a4", "address": "fa:16:3e:0b:b8:2e", "network": {"id": "483bff69-5ba3-437f-8350-a3a773250d59", "bridge": "br-int", "label": "tempest-TestServerBasicOps-708582305-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "950f29a9a7a847c59baa1fbd5f79a146", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd57b878-80", "ovs_interfaceid": "cd57b878-8083-4927-971e-fc15d5c075a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.098 182627 DEBUG nova.network.os_vif_util [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.099 182627 DEBUG os_vif [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.100 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd57b878-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.102 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.105 182627 INFO os_vif [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:b8:2e,bridge_name='br-int',has_traffic_filtering=True,id=cd57b878-8083-4927-971e-fc15d5c075a4,network=Network(483bff69-5ba3-437f-8350-a3a773250d59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd57b878-80')#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.106 182627 INFO nova.virt.libvirt.driver [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Deleting instance files /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25_del#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.107 182627 INFO nova.virt.libvirt.driver [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Deletion of /var/lib/nova/instances/ba4a2a05-d133-44b5-9f11-61443b261a25_del complete#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.121 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f28ebe-c619-4612-8339-939c4e3f505c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629507, 'reachable_time': 20764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241689, 'error': None, 'target': 'ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.124 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-483bff69-5ba3-437f-8350-a3a773250d59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:58:48 np0005592767 systemd[1]: run-netns-ovnmeta\x2d483bff69\x2d5ba3\x2d437f\x2d8350\x2da3a773250d59.mount: Deactivated successfully.
Jan 22 17:58:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:48.124 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c3eabd-14d8-4eca-9409-c0cf251add70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.186 182627 INFO nova.compute.manager [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.186 182627 DEBUG oslo.service.loopingcall [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.186 182627 DEBUG nova.compute.manager [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.187 182627 DEBUG nova.network.neutron [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:58:48 np0005592767 nova_compute[182623]: 2026-01-22 22:58:48.549 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:49.398 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:58:49 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:49.400 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.406 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.439 182627 DEBUG nova.network.neutron [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.470 182627 INFO nova.compute.manager [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Took 1.28 seconds to deallocate network for instance.#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.594 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.595 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.600 182627 DEBUG nova.compute.manager [req-5414b442-fc83-4dc0-9ef5-d6d5c5a0d366 req-a670a770-a4a4-4660-84d3-83f7c1256702 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-vif-deleted-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.674 182627 DEBUG nova.compute.provider_tree [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.693 182627 DEBUG nova.scheduler.client.report [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.714 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.757 182627 INFO nova.scheduler.client.report [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Deleted allocations for instance ba4a2a05-d133-44b5-9f11-61443b261a25#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.824 182627 DEBUG oslo_concurrency.lockutils [None req-9ce60a32-2c6d-426f-91f0-2b113adae1c3 f2dde2bf09b047c99729d9bb2a52f210 950f29a9a7a847c59baa1fbd5f79a146 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:49 np0005592767 nova_compute[182623]: 2026-01-22 22:58:49.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:58:50 np0005592767 nova_compute[182623]: 2026-01-22 22:58:50.163 182627 DEBUG nova.compute.manager [req-f91e74dd-0593-4ca1-9cbd-a21e5b49e2ef req-697b73cd-446a-467c-beea-cb0fed323a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:58:50 np0005592767 nova_compute[182623]: 2026-01-22 22:58:50.163 182627 DEBUG oslo_concurrency.lockutils [req-f91e74dd-0593-4ca1-9cbd-a21e5b49e2ef req-697b73cd-446a-467c-beea-cb0fed323a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:58:50 np0005592767 nova_compute[182623]: 2026-01-22 22:58:50.164 182627 DEBUG oslo_concurrency.lockutils [req-f91e74dd-0593-4ca1-9cbd-a21e5b49e2ef req-697b73cd-446a-467c-beea-cb0fed323a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:58:50 np0005592767 nova_compute[182623]: 2026-01-22 22:58:50.165 182627 DEBUG oslo_concurrency.lockutils [req-f91e74dd-0593-4ca1-9cbd-a21e5b49e2ef req-697b73cd-446a-467c-beea-cb0fed323a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "ba4a2a05-d133-44b5-9f11-61443b261a25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:58:50 np0005592767 nova_compute[182623]: 2026-01-22 22:58:50.165 182627 DEBUG nova.compute.manager [req-f91e74dd-0593-4ca1-9cbd-a21e5b49e2ef req-697b73cd-446a-467c-beea-cb0fed323a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] No waiting events found dispatching network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:58:50 np0005592767 nova_compute[182623]: 2026-01-22 22:58:50.166 182627 WARNING nova.compute.manager [req-f91e74dd-0593-4ca1-9cbd-a21e5b49e2ef req-697b73cd-446a-467c-beea-cb0fed323a66 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Received unexpected event network-vif-plugged-cd57b878-8083-4927-971e-fc15d5c075a4 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:58:50 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:58:50.402 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:58:51 np0005592767 nova_compute[182623]: 2026-01-22 22:58:51.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:53 np0005592767 nova_compute[182623]: 2026-01-22 22:58:53.102 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:53 np0005592767 nova_compute[182623]: 2026-01-22 22:58:53.552 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:58 np0005592767 nova_compute[182623]: 2026-01-22 22:58:58.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:58 np0005592767 nova_compute[182623]: 2026-01-22 22:58:58.276 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:58 np0005592767 nova_compute[182623]: 2026-01-22 22:58:58.381 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:58 np0005592767 nova_compute[182623]: 2026-01-22 22:58:58.553 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:58:58 np0005592767 nova_compute[182623]: 2026-01-22 22:58:58.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:58:59 np0005592767 podman[241691]: 2026-01-22 22:58:59.160614881 +0000 UTC m=+0.079118971 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 17:59:03 np0005592767 nova_compute[182623]: 2026-01-22 22:59:03.062 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122728.0592613, ba4a2a05-d133-44b5-9f11-61443b261a25 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:59:03 np0005592767 nova_compute[182623]: 2026-01-22 22:59:03.062 182627 INFO nova.compute.manager [-] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:59:03 np0005592767 nova_compute[182623]: 2026-01-22 22:59:03.083 182627 DEBUG nova.compute.manager [None req-8831a4f1-9ab6-4cd2-940a-7aedc394c133 - - - - - -] [instance: ba4a2a05-d133-44b5-9f11-61443b261a25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:59:03 np0005592767 nova_compute[182623]: 2026-01-22 22:59:03.104 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:03 np0005592767 nova_compute[182623]: 2026-01-22 22:59:03.555 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:05 np0005592767 podman[241712]: 2026-01-22 22:59:05.147097688 +0000 UTC m=+0.061761039 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, release=1755695350, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Jan 22 17:59:05 np0005592767 podman[241711]: 2026-01-22 22:59:05.175598475 +0000 UTC m=+0.093715354 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 17:59:08 np0005592767 nova_compute[182623]: 2026-01-22 22:59:08.105 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:08 np0005592767 nova_compute[182623]: 2026-01-22 22:59:08.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:12.130 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:12.131 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:12.131 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:12 np0005592767 podman[241756]: 2026-01-22 22:59:12.142101038 +0000 UTC m=+0.062462119 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 17:59:12 np0005592767 podman[241757]: 2026-01-22 22:59:12.142737446 +0000 UTC m=+0.058689042 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:59:13 np0005592767 nova_compute[182623]: 2026-01-22 22:59:13.107 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:13 np0005592767 nova_compute[182623]: 2026-01-22 22:59:13.593 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:17 np0005592767 podman[241799]: 2026-01-22 22:59:17.165596312 +0000 UTC m=+0.075996933 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 17:59:18 np0005592767 nova_compute[182623]: 2026-01-22 22:59:18.108 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:18 np0005592767 nova_compute[182623]: 2026-01-22 22:59:18.636 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:23 np0005592767 nova_compute[182623]: 2026-01-22 22:59:23.109 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:23 np0005592767 nova_compute[182623]: 2026-01-22 22:59:23.639 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:28 np0005592767 nova_compute[182623]: 2026-01-22 22:59:28.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:28 np0005592767 nova_compute[182623]: 2026-01-22 22:59:28.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:30 np0005592767 podman[241824]: 2026-01-22 22:59:30.146267988 +0000 UTC m=+0.069395726 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 17:59:31 np0005592767 nova_compute[182623]: 2026-01-22 22:59:31.984 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:31 np0005592767 nova_compute[182623]: 2026-01-22 22:59:31.985 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:31 np0005592767 nova_compute[182623]: 2026-01-22 22:59:31.997 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.072 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.073 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.080 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.080 182627 INFO nova.compute.claims [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.196 182627 DEBUG nova.compute.provider_tree [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.213 182627 DEBUG nova.scheduler.client.report [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.237 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.238 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.303 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.304 182627 DEBUG nova.network.neutron [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.320 182627 INFO nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.348 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.514 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.516 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.517 182627 INFO nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Creating image(s)#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.518 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "/var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.519 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "/var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.520 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "/var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.545 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.621 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.623 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "5c84b8e4375662442ba075bd9445186e2017954e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.624 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.650 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.729 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.730 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.767 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e,backing_fmt=raw /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.769 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "5c84b8e4375662442ba075bd9445186e2017954e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.770 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.849 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5c84b8e4375662442ba075bd9445186e2017954e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.851 182627 DEBUG nova.virt.disk.api [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Checking if we can resize image /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.852 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.920 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.922 182627 DEBUG nova.virt.disk.api [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Cannot resize image /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.922 182627 DEBUG nova.objects.instance [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lazy-loading 'migration_context' on Instance uuid 2bde8b73-4646-4410-b525-d03b3380f77c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.941 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.941 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Ensure instance console log exists: /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.942 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.943 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:32 np0005592767 nova_compute[182623]: 2026-01-22 22:59:32.943 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:33 np0005592767 nova_compute[182623]: 2026-01-22 22:59:33.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:33 np0005592767 nova_compute[182623]: 2026-01-22 22:59:33.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:34 np0005592767 nova_compute[182623]: 2026-01-22 22:59:34.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:35 np0005592767 nova_compute[182623]: 2026-01-22 22:59:35.612 182627 DEBUG nova.network.neutron [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Successfully created port: 2dbf9088-5515-4252-805a-254d35c36393 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 22 17:59:36 np0005592767 podman[241860]: 2026-01-22 22:59:36.18136018 +0000 UTC m=+0.086534781 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 22 17:59:36 np0005592767 podman[241859]: 2026-01-22 22:59:36.264401522 +0000 UTC m=+0.176109047 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.440 182627 DEBUG nova.network.neutron [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Successfully updated port: 2dbf9088-5515-4252-805a-254d35c36393 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.462 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "refresh_cache-2bde8b73-4646-4410-b525-d03b3380f77c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.463 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquired lock "refresh_cache-2bde8b73-4646-4410-b525-d03b3380f77c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.463 182627 DEBUG nova.network.neutron [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.544 182627 DEBUG nova.compute.manager [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received event network-changed-2dbf9088-5515-4252-805a-254d35c36393 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.545 182627 DEBUG nova.compute.manager [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Refreshing instance network info cache due to event network-changed-2dbf9088-5515-4252-805a-254d35c36393. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.546 182627 DEBUG oslo_concurrency.lockutils [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "refresh_cache-2bde8b73-4646-4410-b525-d03b3380f77c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 22 17:59:37 np0005592767 nova_compute[182623]: 2026-01-22 22:59:37.650 182627 DEBUG nova.network.neutron [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.120 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.544 182627 DEBUG nova.network.neutron [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Updating instance_info_cache with network_info: [{"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.567 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Releasing lock "refresh_cache-2bde8b73-4646-4410-b525-d03b3380f77c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.568 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Instance network_info: |[{"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.568 182627 DEBUG oslo_concurrency.lockutils [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquired lock "refresh_cache-2bde8b73-4646-4410-b525-d03b3380f77c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.568 182627 DEBUG nova.network.neutron [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Refreshing network info cache for port 2dbf9088-5515-4252-805a-254d35c36393 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.571 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Start _get_guest_xml network_info=[{"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '48dd0ec8-2856-44d4-b286-44fdc64ba78d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.577 182627 WARNING nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.584 182627 DEBUG nova.virt.libvirt.host [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.585 182627 DEBUG nova.virt.libvirt.host [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.589 182627 DEBUG nova.virt.libvirt.host [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.590 182627 DEBUG nova.virt.libvirt.host [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.591 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.591 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-22T22:14:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='63b0d901-60c2-48cb-afeb-72a71e897d3d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-22T22:14:40Z,direct_url=<?>,disk_format='qcow2',id=48dd0ec8-2856-44d4-b286-44fdc64ba78d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6912a9182ac44bb486092f7ccd64d58c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-22T22:14:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.591 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.591 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.592 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.592 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.592 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.592 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.592 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.593 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.593 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.593 182627 DEBUG nova.virt.hardware [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.596 182627 DEBUG nova.virt.libvirt.vif [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:59:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-458089501',display_name='tempest-TestServerMultinode-server-458089501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-458089501',id=187,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='648f17d42fa14c7a888033544026cf49',ramdisk_id='',reservation_id='r-ukzjy3je',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1355577921',owner_user_name='tempest-TestServerMultinode-135
5577921-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:59:32Z,user_data=None,user_id='30a80763458b43478ba0f621b8b501f5',uuid=2bde8b73-4646-4410-b525-d03b3380f77c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.597 182627 DEBUG nova.network.os_vif_util [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converting VIF {"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.597 182627 DEBUG nova.network.os_vif_util [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.598 182627 DEBUG nova.objects.instance [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2bde8b73-4646-4410-b525-d03b3380f77c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.612 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <uuid>2bde8b73-4646-4410-b525-d03b3380f77c</uuid>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <name>instance-000000bb</name>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <memory>131072</memory>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <vcpu>1</vcpu>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <metadata>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:name>tempest-TestServerMultinode-server-458089501</nova:name>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:creationTime>2026-01-22 22:59:38</nova:creationTime>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:flavor name="m1.nano">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:memory>128</nova:memory>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:disk>1</nova:disk>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:swap>0</nova:swap>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:ephemeral>0</nova:ephemeral>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:vcpus>1</nova:vcpus>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      </nova:flavor>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:owner>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:user uuid="30a80763458b43478ba0f621b8b501f5">tempest-TestServerMultinode-1355577921-project-admin</nova:user>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:project uuid="648f17d42fa14c7a888033544026cf49">tempest-TestServerMultinode-1355577921</nova:project>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      </nova:owner>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:root type="image" uuid="48dd0ec8-2856-44d4-b286-44fdc64ba78d"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <nova:ports>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        <nova:port uuid="2dbf9088-5515-4252-805a-254d35c36393">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:        </nova:port>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      </nova:ports>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </nova:instance>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </metadata>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <sysinfo type="smbios">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <system>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <entry name="manufacturer">RDO</entry>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <entry name="product">OpenStack Compute</entry>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <entry name="serial">2bde8b73-4646-4410-b525-d03b3380f77c</entry>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <entry name="uuid">2bde8b73-4646-4410-b525-d03b3380f77c</entry>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <entry name="family">Virtual Machine</entry>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </system>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </sysinfo>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <os>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <boot dev="hd"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <smbios mode="sysinfo"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </os>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <features>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <acpi/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <apic/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <vmcoreinfo/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </features>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <clock offset="utc">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <timer name="pit" tickpolicy="delay"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <timer name="hpet" present="no"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </clock>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <cpu mode="custom" match="exact">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <model>Nehalem</model>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <topology sockets="1" cores="1" threads="1"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </cpu>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  <devices>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <disk type="file" device="disk">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <target dev="vda" bus="virtio"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <disk type="file" device="cdrom">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <driver name="qemu" type="raw" cache="none"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <source file="/var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.config"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <target dev="sda" bus="sata"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </disk>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <interface type="ethernet">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <mac address="fa:16:3e:18:64:37"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <driver name="vhost" rx_queue_size="512"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <mtu size="1442"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <target dev="tap2dbf9088-55"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </interface>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <serial type="pty">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <log file="/var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/console.log" append="off"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </serial>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <video>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <model type="virtio"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </video>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <input type="tablet" bus="usb"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <rng model="virtio">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <backend model="random">/dev/urandom</backend>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </rng>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="pci" model="pcie-root-port"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <controller type="usb" index="0"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    <memballoon model="virtio">
Jan 22 17:59:38 np0005592767 nova_compute[182623]:      <stats period="10"/>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:    </memballoon>
Jan 22 17:59:38 np0005592767 nova_compute[182623]:  </devices>
Jan 22 17:59:38 np0005592767 nova_compute[182623]: </domain>
Jan 22 17:59:38 np0005592767 nova_compute[182623]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.613 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Preparing to wait for external event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.614 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.614 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.614 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.615 182627 DEBUG nova.virt.libvirt.vif [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T22:59:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-458089501',display_name='tempest-TestServerMultinode-server-458089501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-458089501',id=187,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='648f17d42fa14c7a888033544026cf49',ramdisk_id='',reservation_id='r-ukzjy3je',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1355577921',owner_user_name='tempest-TestServerMultinode-1355577921-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T22:59:32Z,user_data=None,user_id='30a80763458b43478ba0f621b8b501f5',uuid=2bde8b73-4646-4410-b525-d03b3380f77c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.615 182627 DEBUG nova.network.os_vif_util [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converting VIF {"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.616 182627 DEBUG nova.network.os_vif_util [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.616 182627 DEBUG os_vif [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.617 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.617 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.618 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.621 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.621 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dbf9088-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.621 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2dbf9088-55, col_values=(('external_ids', {'iface-id': '2dbf9088-5515-4252-805a-254d35c36393', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:64:37', 'vm-uuid': '2bde8b73-4646-4410-b525-d03b3380f77c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.623 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:38 np0005592767 NetworkManager[54973]: <info>  [1769122778.6250] manager: (tap2dbf9088-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.625 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.631 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.633 182627 INFO os_vif [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55')#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.645 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.684 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.684 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.685 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] No VIF found with MAC fa:16:3e:18:64:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.685 182627 INFO nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Using config drive#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.906 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.907 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.907 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.937 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 22 17:59:38 np0005592767 nova_compute[182623]: 2026-01-22 22:59:38.938 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.520 182627 INFO nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Creating config drive at /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.config#033[00m
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.530 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw1aknqqz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.671 182627 DEBUG oslo_concurrency.processutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw1aknqqz" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:39 np0005592767 kernel: tap2dbf9088-55: entered promiscuous mode
Jan 22 17:59:39 np0005592767 NetworkManager[54973]: <info>  [1769122779.7423] manager: (tap2dbf9088-55): new Tun device (/org/freedesktop/NetworkManager/Devices/387)
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.743 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.746 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:39Z|00805|binding|INFO|Claiming lport 2dbf9088-5515-4252-805a-254d35c36393 for this chassis.
Jan 22 17:59:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:39Z|00806|binding|INFO|2dbf9088-5515-4252-805a-254d35c36393: Claiming fa:16:3e:18:64:37 10.100.0.13
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.755 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:64:37 10.100.0.13'], port_security=['fa:16:3e:18:64:37 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2bde8b73-4646-4410-b525-d03b3380f77c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82a04532-65dc-4565-8faf-3e7913e3093d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '648f17d42fa14c7a888033544026cf49', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2b775d78-5eae-44d4-acd5-c2e8bb690b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd576e24-eb7f-4cfe-9779-094eaf513ece, chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2dbf9088-5515-4252-805a-254d35c36393) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.756 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2dbf9088-5515-4252-805a-254d35c36393 in datapath 82a04532-65dc-4565-8faf-3e7913e3093d bound to our chassis#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.758 104135 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82a04532-65dc-4565-8faf-3e7913e3093d#033[00m
Jan 22 17:59:39 np0005592767 systemd-udevd[241926]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.771 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[d68c633f-de11-4c5b-bfd0-aaa797da3b98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.772 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82a04532-61 in ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.773 211609 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82a04532-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.774 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[28c0ec6e-2d28-4ffa-9bf3-6f3571fdc84d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.775 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[28065c22-3dda-491b-a580-9ffb99a85b95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 NetworkManager[54973]: <info>  [1769122779.7771] device (tap2dbf9088-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 17:59:39 np0005592767 NetworkManager[54973]: <info>  [1769122779.7776] device (tap2dbf9088-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 17:59:39 np0005592767 systemd-machined[153912]: New machine qemu-96-instance-000000bb.
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.786 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1299e9-dc2d-4b8d-94e2-9b15474cd5c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.799 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b090293b-9b80-4782-b5e6-a45d95f48e57]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.800 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:39Z|00807|binding|INFO|Setting lport 2dbf9088-5515-4252-805a-254d35c36393 ovn-installed in OVS
Jan 22 17:59:39 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:39Z|00808|binding|INFO|Setting lport 2dbf9088-5515-4252-805a-254d35c36393 up in Southbound
Jan 22 17:59:39 np0005592767 systemd[1]: Started Virtual Machine qemu-96-instance-000000bb.
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.806 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.831 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[462b76d8-9063-49b3-aa53-8eb7237d4c92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 systemd-udevd[241930]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 17:59:39 np0005592767 NetworkManager[54973]: <info>  [1769122779.8391] manager: (tap82a04532-60): new Veth device (/org/freedesktop/NetworkManager/Devices/388)
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.838 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1f7a1b-b441-45fd-893e-4549e94c19e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.872 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[eef20041-1011-4b6e-b935-17fdea5c074a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.877 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa8535a-2ee0-40da-a8d8-171c0625857d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 nova_compute[182623]: 2026-01-22 22:59:39.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:39 np0005592767 NetworkManager[54973]: <info>  [1769122779.9045] device (tap82a04532-60): carrier: link connected
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.909 211648 DEBUG oslo.privsep.daemon [-] privsep: reply[72ad8e81-1543-4ca0-9faa-32dac9c0337e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.926 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[6862f15e-05df-4391-bbdd-990dcdef328c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82a04532-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:78:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 245], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637648, 'reachable_time': 24932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241960, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.946 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c98ebf-ef39-4f33-a864-b8ded30f1f38]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:783b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637648, 'tstamp': 637648}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241961, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:39 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:39.967 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[337257a4-63ae-4816-8ce8-bbd6bb8cd7c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82a04532-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:78:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 245], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637648, 'reachable_time': 24932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241962, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.000 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[748c4945-183c-4112-94d3-077f1511e03f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.068 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[41f9a086-c644-41bd-859b-6c5f5978a569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.069 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82a04532-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.070 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.070 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82a04532-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.072 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:40 np0005592767 NetworkManager[54973]: <info>  [1769122780.0726] manager: (tap82a04532-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Jan 22 17:59:40 np0005592767 kernel: tap82a04532-60: entered promiscuous mode
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.074 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.078 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82a04532-60, col_values=(('external_ids', {'iface-id': 'e9aceb3b-3bda-4638-9b85-f2d0610c9277'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.080 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:40 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:40Z|00809|binding|INFO|Releasing lport e9aceb3b-3bda-4638-9b85-f2d0610c9277 from this chassis (sb_readonly=0)
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.081 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.082 104135 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82a04532-65dc-4565-8faf-3e7913e3093d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82a04532-65dc-4565-8faf-3e7913e3093d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.083 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0f082e53-96a2-46ed-8ee2-7b9e1bc17b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.084 104135 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: global
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    log         /dev/log local0 debug
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    log-tag     haproxy-metadata-proxy-82a04532-65dc-4565-8faf-3e7913e3093d
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    user        root
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    group       root
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    maxconn     1024
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    pidfile     /var/lib/neutron/external/pids/82a04532-65dc-4565-8faf-3e7913e3093d.pid.haproxy
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    daemon
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: defaults
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    log global
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    mode http
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    option httplog
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    option dontlognull
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    option http-server-close
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    option forwardfor
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    retries                 3
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    timeout http-request    30s
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    timeout connect         30s
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    timeout client          32s
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    timeout server          32s
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    timeout http-keep-alive 30s
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: listen listener
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    bind 169.254.169.254:80
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    server metadata /var/lib/neutron/metadata_proxy
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]:    http-request add-header X-OVN-Network-ID 82a04532-65dc-4565-8faf-3e7913e3093d
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 22 17:59:40 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:40.085 104135 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'env', 'PROCESS_TAG=haproxy-82a04532-65dc-4565-8faf-3e7913e3093d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82a04532-65dc-4565-8faf-3e7913e3093d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.092 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.106 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122780.1056342, 2bde8b73-4646-4410-b525-d03b3380f77c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.106 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] VM Started (Lifecycle Event)#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.125 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.130 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122780.1058764, 2bde8b73-4646-4410-b525-d03b3380f77c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.130 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] VM Paused (Lifecycle Event)#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.143 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.146 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.162 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:59:40 np0005592767 podman[242001]: 2026-01-22 22:59:40.496451698 +0000 UTC m=+0.091429409 container create 9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:59:40 np0005592767 podman[242001]: 2026-01-22 22:59:40.424683466 +0000 UTC m=+0.019661207 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 17:59:40 np0005592767 systemd[1]: Started libpod-conmon-9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b.scope.
Jan 22 17:59:40 np0005592767 systemd[1]: Started libcrun container.
Jan 22 17:59:40 np0005592767 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa1f4d3cc5964f40ce0a0a9116a02084b50d62e849271ceb25a5883590648d41/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.610 182627 DEBUG nova.compute.manager [req-848a5f90-3384-4f8d-b7fe-cc2363bb895e req-2905778e-c997-49ce-95da-d22aa0bd2380 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.611 182627 DEBUG oslo_concurrency.lockutils [req-848a5f90-3384-4f8d-b7fe-cc2363bb895e req-2905778e-c997-49ce-95da-d22aa0bd2380 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.611 182627 DEBUG oslo_concurrency.lockutils [req-848a5f90-3384-4f8d-b7fe-cc2363bb895e req-2905778e-c997-49ce-95da-d22aa0bd2380 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.612 182627 DEBUG oslo_concurrency.lockutils [req-848a5f90-3384-4f8d-b7fe-cc2363bb895e req-2905778e-c997-49ce-95da-d22aa0bd2380 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.612 182627 DEBUG nova.compute.manager [req-848a5f90-3384-4f8d-b7fe-cc2363bb895e req-2905778e-c997-49ce-95da-d22aa0bd2380 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Processing event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.613 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.616 182627 DEBUG nova.virt.driver [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] Emitting event <LifecycleEvent: 1769122780.6165936, 2bde8b73-4646-4410-b525-d03b3380f77c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.617 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] VM Resumed (Lifecycle Event)#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.619 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.622 182627 INFO nova.virt.libvirt.driver [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Instance spawned successfully.#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.622 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 22 17:59:40 np0005592767 podman[242001]: 2026-01-22 22:59:40.636454362 +0000 UTC m=+0.231432093 container init 9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 17:59:40 np0005592767 podman[242001]: 2026-01-22 22:59:40.642048141 +0000 UTC m=+0.237025852 container start 9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 17:59:40 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [NOTICE]   (242020) : New worker (242022) forked
Jan 22 17:59:40 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [NOTICE]   (242020) : Loading success.
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.662 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.663 182627 DEBUG nova.network.neutron [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Updated VIF entry in instance network info cache for port 2dbf9088-5515-4252-805a-254d35c36393. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.664 182627 DEBUG nova.network.neutron [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Updating instance_info_cache with network_info: [{"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.675 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.676 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.676 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.676 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.677 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.677 182627 DEBUG nova.virt.libvirt.driver [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.681 182627 DEBUG nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.730 182627 DEBUG oslo_concurrency.lockutils [req-4d514d64-6169-4a9e-b416-ec3d7b8c498c req-6803bc2b-32a1-4271-95e1-71ae12434421 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Releasing lock "refresh_cache-2bde8b73-4646-4410-b525-d03b3380f77c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.796 182627 INFO nova.compute.manager [None req-1a75e5fd-c79b-46f8-98e0-35b710204721 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.972 182627 INFO nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Took 8.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 22 17:59:40 np0005592767 nova_compute[182623]: 2026-01-22 22:59:40.973 182627 DEBUG nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.181 182627 INFO nova.compute.manager [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Took 9.14 seconds to build instance.#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.196 182627 DEBUG oslo_concurrency.lockutils [None req-aee856c8-c472-4754-8e1b-0c8b7944f3e7 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.915 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.915 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.916 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.916 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 17:59:41 np0005592767 nova_compute[182623]: 2026-01-22 22:59:41.984 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.070 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.072 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.147 182627 DEBUG oslo_concurrency.processutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.305 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.307 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5537MB free_disk=73.05026245117188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.307 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.308 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.539 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Instance 2bde8b73-4646-4410-b525-d03b3380f77c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.539 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.540 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.573 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.574 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.575 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.575 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.576 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.598 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.601 182627 INFO nova.compute.manager [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Terminating instance#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.621 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.626 182627 DEBUG nova.compute.manager [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 22 17:59:42 np0005592767 kernel: tap2dbf9088-55 (unregistering): left promiscuous mode
Jan 22 17:59:42 np0005592767 NetworkManager[54973]: <info>  [1769122782.6534] device (tap2dbf9088-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.657 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.657 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.662 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:42Z|00810|binding|INFO|Releasing lport 2dbf9088-5515-4252-805a-254d35c36393 from this chassis (sb_readonly=0)
Jan 22 17:59:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:42Z|00811|binding|INFO|Setting lport 2dbf9088-5515-4252-805a-254d35c36393 down in Southbound
Jan 22 17:59:42 np0005592767 ovn_controller[94769]: 2026-01-22T22:59:42Z|00812|binding|INFO|Removing iface tap2dbf9088-55 ovn-installed in OVS
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.678 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:64:37 10.100.0.13'], port_security=['fa:16:3e:18:64:37 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2bde8b73-4646-4410-b525-d03b3380f77c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82a04532-65dc-4565-8faf-3e7913e3093d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '648f17d42fa14c7a888033544026cf49', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2b775d78-5eae-44d4-acd5-c2e8bb690b1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd576e24-eb7f-4cfe-9779-094eaf513ece, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1308162730>], logical_port=2dbf9088-5515-4252-805a-254d35c36393) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1308162730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.680 104135 INFO neutron.agent.ovn.metadata.agent [-] Port 2dbf9088-5515-4252-805a-254d35c36393 in datapath 82a04532-65dc-4565-8faf-3e7913e3093d unbound from our chassis#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.682 104135 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82a04532-65dc-4565-8faf-3e7913e3093d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.684 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[bc60d362-d63e-4948-afa5-9d9e60558706]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.684 104135 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d namespace which is not needed anymore#033[00m
Jan 22 17:59:42 np0005592767 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Jan 22 17:59:42 np0005592767 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000bb.scope: Consumed 2.281s CPU time.
Jan 22 17:59:42 np0005592767 systemd-machined[153912]: Machine qemu-96-instance-000000bb terminated.
Jan 22 17:59:42 np0005592767 podman[242039]: 2026-01-22 22:59:42.72882145 +0000 UTC m=+0.055686597 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.736 182627 DEBUG nova.compute.manager [req-dc354d10-c8ca-46fd-a187-61345685d5ec req-6e1dc5b6-a747-42e8-bd0a-7613302342a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.736 182627 DEBUG oslo_concurrency.lockutils [req-dc354d10-c8ca-46fd-a187-61345685d5ec req-6e1dc5b6-a747-42e8-bd0a-7613302342a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.737 182627 DEBUG oslo_concurrency.lockutils [req-dc354d10-c8ca-46fd-a187-61345685d5ec req-6e1dc5b6-a747-42e8-bd0a-7613302342a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.737 182627 DEBUG oslo_concurrency.lockutils [req-dc354d10-c8ca-46fd-a187-61345685d5ec req-6e1dc5b6-a747-42e8-bd0a-7613302342a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.737 182627 DEBUG nova.compute.manager [req-dc354d10-c8ca-46fd-a187-61345685d5ec req-6e1dc5b6-a747-42e8-bd0a-7613302342a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] No waiting events found dispatching network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.737 182627 WARNING nova.compute.manager [req-dc354d10-c8ca-46fd-a187-61345685d5ec req-6e1dc5b6-a747-42e8-bd0a-7613302342a7 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received unexpected event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 for instance with vm_state active and task_state deleting.#033[00m
Jan 22 17:59:42 np0005592767 podman[242045]: 2026-01-22 22:59:42.763816481 +0000 UTC m=+0.076878498 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 17:59:42 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [NOTICE]   (242020) : haproxy version is 2.8.14-c23fe91
Jan 22 17:59:42 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [NOTICE]   (242020) : path to executable is /usr/sbin/haproxy
Jan 22 17:59:42 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [WARNING]  (242020) : Exiting Master process...
Jan 22 17:59:42 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [ALERT]    (242020) : Current worker (242022) exited with code 143 (Terminated)
Jan 22 17:59:42 np0005592767 neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d[242016]: [WARNING]  (242020) : All workers exited. Exiting... (0)
Jan 22 17:59:42 np0005592767 systemd[1]: libpod-9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b.scope: Deactivated successfully.
Jan 22 17:59:42 np0005592767 conmon[242016]: conmon 9aa03eb7fc9ae9c05453 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b.scope/container/memory.events
Jan 22 17:59:42 np0005592767 podman[242104]: 2026-01-22 22:59:42.837461036 +0000 UTC m=+0.060700850 container died 9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 17:59:42 np0005592767 NetworkManager[54973]: <info>  [1769122782.8463] manager: (tap2dbf9088-55): new Tun device (/org/freedesktop/NetworkManager/Devices/390)
Jan 22 17:59:42 np0005592767 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b-userdata-shm.mount: Deactivated successfully.
Jan 22 17:59:42 np0005592767 systemd[1]: var-lib-containers-storage-overlay-aa1f4d3cc5964f40ce0a0a9116a02084b50d62e849271ceb25a5883590648d41-merged.mount: Deactivated successfully.
Jan 22 17:59:42 np0005592767 podman[242104]: 2026-01-22 22:59:42.877714216 +0000 UTC m=+0.100954020 container cleanup 9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.890 182627 INFO nova.virt.libvirt.driver [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Instance destroyed successfully.#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.890 182627 DEBUG nova.objects.instance [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lazy-loading 'resources' on Instance uuid 2bde8b73-4646-4410-b525-d03b3380f77c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 22 17:59:42 np0005592767 systemd[1]: libpod-conmon-9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b.scope: Deactivated successfully.
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.905 182627 DEBUG nova.virt.libvirt.vif [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T22:59:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-458089501',display_name='tempest-TestServerMultinode-server-458089501',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-458089501',id=187,image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T22:59:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='648f17d42fa14c7a888033544026cf49',ramdisk_id='',reservation_id='r-ukzjy3je',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='48dd0ec8-2856-44d4-b286-44fdc64ba78d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1355577921',owner_user_name='tempest-TestServerMultinode-1355577921-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T22:59:41Z,user_data=None,user_id='30a80763458b43478ba0f621b8b501f5',uuid=2bde8b73-4646-4410-b525-d03b3380f77c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.906 182627 DEBUG nova.network.os_vif_util [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converting VIF {"id": "2dbf9088-5515-4252-805a-254d35c36393", "address": "fa:16:3e:18:64:37", "network": {"id": "82a04532-65dc-4565-8faf-3e7913e3093d", "bridge": "br-int", "label": "tempest-TestServerMultinode-2036800446-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb9aae3077de4c17aa17a59f6071e935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2dbf9088-55", "ovs_interfaceid": "2dbf9088-5515-4252-805a-254d35c36393", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.906 182627 DEBUG nova.network.os_vif_util [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.907 182627 DEBUG os_vif [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.909 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.910 182627 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dbf9088-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.913 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.916 182627 INFO os_vif [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:64:37,bridge_name='br-int',has_traffic_filtering=True,id=2dbf9088-5515-4252-805a-254d35c36393,network=Network(82a04532-65dc-4565-8faf-3e7913e3093d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2dbf9088-55')#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.917 182627 INFO nova.virt.libvirt.driver [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Deleting instance files /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c_del#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.918 182627 INFO nova.virt.libvirt.driver [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Deletion of /var/lib/nova/instances/2bde8b73-4646-4410-b525-d03b3380f77c_del complete#033[00m
Jan 22 17:59:42 np0005592767 podman[242143]: 2026-01-22 22:59:42.939741462 +0000 UTC m=+0.043614056 container remove 9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.944 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[745075c8-01ec-43e7-b87f-6d4227d2f5f8]: (4, ('Thu Jan 22 10:59:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d (9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b)\n9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b\nThu Jan 22 10:59:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d (9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b)\n9aa03eb7fc9ae9c05453e449ddb75c015dee1f745e7564e7c64a7941d3384e7b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.946 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[0e77beb1-d776-462b-aa3b-614bdd82ee83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.947 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82a04532-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.948 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:42 np0005592767 kernel: tap82a04532-60: left promiscuous mode
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.960 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.962 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[71058c26-2888-455c-a76d-f21a9598f93c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.975 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[75fce1da-ac7b-4e9f-97c4-4976129764de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.977 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[e23c4b44-3271-4479-9cfa-97b4d10d8588]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.990 182627 INFO nova.compute.manager [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.991 182627 DEBUG oslo.service.loopingcall [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.991 182627 DEBUG nova.compute.manager [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 22 17:59:42 np0005592767 nova_compute[182623]: 2026-01-22 22:59:42.991 182627 DEBUG nova.network.neutron [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.991 211609 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f45145-a92f-47e8-b33c-c632ab8bf685]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637640, 'reachable_time': 33235, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242166, 'error': None, 'target': 'ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.993 104518 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82a04532-65dc-4565-8faf-3e7913e3093d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 22 17:59:42 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:42.993 104518 DEBUG oslo.privsep.daemon [-] privsep: reply[6575282c-9682-40bf-8f46-43c79cb37c0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 22 17:59:42 np0005592767 systemd[1]: run-netns-ovnmeta\x2d82a04532\x2d65dc\x2d4565\x2d8faf\x2d3e7913e3093d.mount: Deactivated successfully.
Jan 22 17:59:43 np0005592767 nova_compute[182623]: 2026-01-22 22:59:43.647 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:43 np0005592767 nova_compute[182623]: 2026-01-22 22:59:43.878 182627 DEBUG nova.network.neutron [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 22 17:59:43 np0005592767 nova_compute[182623]: 2026-01-22 22:59:43.912 182627 INFO nova.compute.manager [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Took 0.92 seconds to deallocate network for instance.#033[00m
Jan 22 17:59:43 np0005592767 nova_compute[182623]: 2026-01-22 22:59:43.979 182627 DEBUG nova.compute.manager [req-c512e488-9f7a-4fcd-83f2-1319b4ed22d4 req-1f52a09e-a976-4773-ac39-aaaba054a4a1 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received event network-vif-deleted-2dbf9088-5515-4252-805a-254d35c36393 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:59:43 np0005592767 nova_compute[182623]: 2026-01-22 22:59:43.995 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:43 np0005592767 nova_compute[182623]: 2026-01-22 22:59:43.995 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.042 182627 DEBUG nova.compute.provider_tree [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.057 182627 DEBUG nova.scheduler.client.report [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.076 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.095 182627 INFO nova.scheduler.client.report [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Deleted allocations for instance 2bde8b73-4646-4410-b525-d03b3380f77c#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.176 182627 DEBUG oslo_concurrency.lockutils [None req-7bbfbce4-c7f7-4aa0-94d5-40d99f73940e 30a80763458b43478ba0f621b8b501f5 648f17d42fa14c7a888033544026cf49 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.658 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.854 182627 DEBUG nova.compute.manager [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received event network-vif-unplugged-2dbf9088-5515-4252-805a-254d35c36393 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.855 182627 DEBUG oslo_concurrency.lockutils [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.856 182627 DEBUG oslo_concurrency.lockutils [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.856 182627 DEBUG oslo_concurrency.lockutils [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.857 182627 DEBUG nova.compute.manager [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] No waiting events found dispatching network-vif-unplugged-2dbf9088-5515-4252-805a-254d35c36393 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.858 182627 WARNING nova.compute.manager [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received unexpected event network-vif-unplugged-2dbf9088-5515-4252-805a-254d35c36393 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.858 182627 DEBUG nova.compute.manager [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.859 182627 DEBUG oslo_concurrency.lockutils [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Acquiring lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.859 182627 DEBUG oslo_concurrency.lockutils [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.860 182627 DEBUG oslo_concurrency.lockutils [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] Lock "2bde8b73-4646-4410-b525-d03b3380f77c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.860 182627 DEBUG nova.compute.manager [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] No waiting events found dispatching network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.861 182627 WARNING nova.compute.manager [req-87c61e76-7fe1-4204-8428-5ba3ed9e4969 req-0e636225-745a-443b-a8d7-c078429e9c95 6dec9a77c953419d95fc70959307a00d 527022705ec9474bacb3ece1a162f8b7 - - default default] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Received unexpected event network-vif-plugged-2dbf9088-5515-4252-805a-254d35c36393 for instance with vm_state deleted and task_state None.#033[00m
Jan 22 17:59:44 np0005592767 nova_compute[182623]: 2026-01-22 22:59:44.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:47 np0005592767 nova_compute[182623]: 2026-01-22 22:59:47.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:47 np0005592767 nova_compute[182623]: 2026-01-22 22:59:47.913 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:48 np0005592767 podman[242167]: 2026-01-22 22:59:48.157791695 +0000 UTC m=+0.075848319 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 17:59:48 np0005592767 nova_compute[182623]: 2026-01-22 22:59:48.650 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:49 np0005592767 nova_compute[182623]: 2026-01-22 22:59:49.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:49 np0005592767 nova_compute[182623]: 2026-01-22 22:59:49.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 17:59:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:51.436 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 17:59:51 np0005592767 nova_compute[182623]: 2026-01-22 22:59:51.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:51 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:51.438 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 17:59:51 np0005592767 nova_compute[182623]: 2026-01-22 22:59:51.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:52 np0005592767 nova_compute[182623]: 2026-01-22 22:59:52.916 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:53 np0005592767 nova_compute[182623]: 2026-01-22 22:59:53.651 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:54 np0005592767 nova_compute[182623]: 2026-01-22 22:59:54.150 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:57 np0005592767 nova_compute[182623]: 2026-01-22 22:59:57.889 182627 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769122782.8872232, 2bde8b73-4646-4410-b525-d03b3380f77c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 22 17:59:57 np0005592767 nova_compute[182623]: 2026-01-22 22:59:57.890 182627 INFO nova.compute.manager [-] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] VM Stopped (Lifecycle Event)#033[00m
Jan 22 17:59:57 np0005592767 nova_compute[182623]: 2026-01-22 22:59:57.908 182627 DEBUG nova.compute.manager [None req-df450d00-5d85-4cb8-8172-975d32f12962 - - - - - -] [instance: 2bde8b73-4646-4410-b525-d03b3380f77c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 22 17:59:57 np0005592767 nova_compute[182623]: 2026-01-22 22:59:57.919 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:58 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 22:59:58.441 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 17:59:58 np0005592767 nova_compute[182623]: 2026-01-22 22:59:58.654 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 17:59:58 np0005592767 nova_compute[182623]: 2026-01-22 22:59:58.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:58 np0005592767 nova_compute[182623]: 2026-01-22 22:59:58.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 17:59:59 np0005592767 nova_compute[182623]: 2026-01-22 22:59:59.910 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:59 np0005592767 nova_compute[182623]: 2026-01-22 22:59:59.911 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 17:59:59 np0005592767 nova_compute[182623]: 2026-01-22 22:59:59.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 17:59:59 np0005592767 nova_compute[182623]: 2026-01-22 22:59:59.935 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 18:00:00 np0005592767 nova_compute[182623]: 2026-01-22 23:00:00.916 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:01 np0005592767 podman[242194]: 2026-01-22 23:00:01.135106924 +0000 UTC m=+0.056180021 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 18:00:02 np0005592767 nova_compute[182623]: 2026-01-22 23:00:02.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:03 np0005592767 nova_compute[182623]: 2026-01-22 23:00:03.655 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:07 np0005592767 podman[242214]: 2026-01-22 23:00:07.171157284 +0000 UTC m=+0.091310776 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 18:00:07 np0005592767 podman[242215]: 2026-01-22 23:00:07.182340721 +0000 UTC m=+0.090066011 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:00:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:00:07 np0005592767 nova_compute[182623]: 2026-01-22 23:00:07.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:08 np0005592767 nova_compute[182623]: 2026-01-22 23:00:08.657 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:00:12.131 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:00:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:00:12.133 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:00:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:00:12.133 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:00:12 np0005592767 nova_compute[182623]: 2026-01-22 23:00:12.930 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:13 np0005592767 podman[242260]: 2026-01-22 23:00:13.140929099 +0000 UTC m=+0.052177528 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:00:13 np0005592767 podman[242259]: 2026-01-22 23:00:13.169613191 +0000 UTC m=+0.077630919 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 18:00:13 np0005592767 nova_compute[182623]: 2026-01-22 23:00:13.701 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:17 np0005592767 nova_compute[182623]: 2026-01-22 23:00:17.933 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:18 np0005592767 nova_compute[182623]: 2026-01-22 23:00:18.704 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:19 np0005592767 podman[242303]: 2026-01-22 23:00:19.134022914 +0000 UTC m=+0.055446981 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:00:22 np0005592767 nova_compute[182623]: 2026-01-22 23:00:22.966 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:23 np0005592767 nova_compute[182623]: 2026-01-22 23:00:23.706 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:27 np0005592767 nova_compute[182623]: 2026-01-22 23:00:27.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:28 np0005592767 ovn_controller[94769]: 2026-01-22T23:00:28Z|00813|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 22 18:00:28 np0005592767 nova_compute[182623]: 2026-01-22 23:00:28.746 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:32 np0005592767 podman[242328]: 2026-01-22 23:00:32.147642032 +0000 UTC m=+0.065261169 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 18:00:32 np0005592767 nova_compute[182623]: 2026-01-22 23:00:32.970 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:33 np0005592767 nova_compute[182623]: 2026-01-22 23:00:33.751 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:37 np0005592767 nova_compute[182623]: 2026-01-22 23:00:37.998 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:38 np0005592767 podman[242351]: 2026-01-22 23:00:38.17896489 +0000 UTC m=+0.094244990 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 18:00:38 np0005592767 podman[242350]: 2026-01-22 23:00:38.235446329 +0000 UTC m=+0.146902650 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:00:38 np0005592767 nova_compute[182623]: 2026-01-22 23:00:38.752 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:40 np0005592767 nova_compute[182623]: 2026-01-22 23:00:40.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:40 np0005592767 nova_compute[182623]: 2026-01-22 23:00:40.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:00:40 np0005592767 nova_compute[182623]: 2026-01-22 23:00:40.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:00:40 np0005592767 nova_compute[182623]: 2026-01-22 23:00:40.915 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:00:40 np0005592767 nova_compute[182623]: 2026-01-22 23:00:40.915 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:41 np0005592767 nova_compute[182623]: 2026-01-22 23:00:41.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:41 np0005592767 nova_compute[182623]: 2026-01-22 23:00:41.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:00:41 np0005592767 nova_compute[182623]: 2026-01-22 23:00:41.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:00:41 np0005592767 nova_compute[182623]: 2026-01-22 23:00:41.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:00:41 np0005592767 nova_compute[182623]: 2026-01-22 23:00:41.920 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.100 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.101 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5716MB free_disk=73.05118942260742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.101 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.101 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.301 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.301 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.325 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.339 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.360 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:00:42 np0005592767 nova_compute[182623]: 2026-01-22 23:00:42.360 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:00:43 np0005592767 nova_compute[182623]: 2026-01-22 23:00:43.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:43 np0005592767 nova_compute[182623]: 2026-01-22 23:00:43.755 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:44 np0005592767 podman[242397]: 2026-01-22 23:00:44.14366412 +0000 UTC m=+0.060105652 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:00:44 np0005592767 podman[242396]: 2026-01-22 23:00:44.153139919 +0000 UTC m=+0.073957695 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:00:45 np0005592767 nova_compute[182623]: 2026-01-22 23:00:45.361 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:45 np0005592767 nova_compute[182623]: 2026-01-22 23:00:45.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:47 np0005592767 nova_compute[182623]: 2026-01-22 23:00:47.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:48 np0005592767 nova_compute[182623]: 2026-01-22 23:00:48.004 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:48 np0005592767 nova_compute[182623]: 2026-01-22 23:00:48.756 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:50 np0005592767 podman[242437]: 2026-01-22 23:00:50.147979732 +0000 UTC m=+0.061341608 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:00:51 np0005592767 nova_compute[182623]: 2026-01-22 23:00:51.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:51 np0005592767 nova_compute[182623]: 2026-01-22 23:00:51.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:00:53 np0005592767 nova_compute[182623]: 2026-01-22 23:00:53.007 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:53 np0005592767 nova_compute[182623]: 2026-01-22 23:00:53.647 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:53 np0005592767 nova_compute[182623]: 2026-01-22 23:00:53.757 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:53 np0005592767 nova_compute[182623]: 2026-01-22 23:00:53.911 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:00:58 np0005592767 nova_compute[182623]: 2026-01-22 23:00:58.010 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:58 np0005592767 nova_compute[182623]: 2026-01-22 23:00:58.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:00:59 np0005592767 nova_compute[182623]: 2026-01-22 23:00:59.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:03 np0005592767 nova_compute[182623]: 2026-01-22 23:01:03.014 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:03 np0005592767 podman[242472]: 2026-01-22 23:01:03.130938852 +0000 UTC m=+0.055107601 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 18:01:03 np0005592767 nova_compute[182623]: 2026-01-22 23:01:03.760 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:08 np0005592767 nova_compute[182623]: 2026-01-22 23:01:08.016 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:08 np0005592767 nova_compute[182623]: 2026-01-22 23:01:08.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:09 np0005592767 podman[242493]: 2026-01-22 23:01:09.154004617 +0000 UTC m=+0.061820352 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, distribution-scope=public)
Jan 22 18:01:09 np0005592767 podman[242492]: 2026-01-22 23:01:09.23608478 +0000 UTC m=+0.139958043 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:12.133 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:12.133 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:12.134 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:12.754 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 18:01:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:12.755 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 18:01:12 np0005592767 nova_compute[182623]: 2026-01-22 23:01:12.755 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:13 np0005592767 nova_compute[182623]: 2026-01-22 23:01:13.018 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:13 np0005592767 nova_compute[182623]: 2026-01-22 23:01:13.764 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:15 np0005592767 podman[242539]: 2026-01-22 23:01:15.128992968 +0000 UTC m=+0.051994023 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:01:15 np0005592767 podman[242538]: 2026-01-22 23:01:15.164219835 +0000 UTC m=+0.085761829 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 18:01:18 np0005592767 nova_compute[182623]: 2026-01-22 23:01:18.020 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:18 np0005592767 nova_compute[182623]: 2026-01-22 23:01:18.767 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:19 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:19.757 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 18:01:21 np0005592767 podman[242577]: 2026-01-22 23:01:21.135230895 +0000 UTC m=+0.054736711 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:01:23 np0005592767 nova_compute[182623]: 2026-01-22 23:01:23.022 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:23 np0005592767 nova_compute[182623]: 2026-01-22 23:01:23.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:28 np0005592767 nova_compute[182623]: 2026-01-22 23:01:28.024 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:28 np0005592767 nova_compute[182623]: 2026-01-22 23:01:28.779 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:33 np0005592767 nova_compute[182623]: 2026-01-22 23:01:33.026 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:33 np0005592767 nova_compute[182623]: 2026-01-22 23:01:33.782 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:34 np0005592767 podman[242601]: 2026-01-22 23:01:34.145672483 +0000 UTC m=+0.061589794 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 22 18:01:38 np0005592767 nova_compute[182623]: 2026-01-22 23:01:38.029 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:38 np0005592767 nova_compute[182623]: 2026-01-22 23:01:38.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:40 np0005592767 podman[242622]: 2026-01-22 23:01:40.152781785 +0000 UTC m=+0.063090388 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 18:01:40 np0005592767 podman[242621]: 2026-01-22 23:01:40.157973572 +0000 UTC m=+0.081016375 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 18:01:40 np0005592767 nova_compute[182623]: 2026-01-22 23:01:40.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:40 np0005592767 nova_compute[182623]: 2026-01-22 23:01:40.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:01:40 np0005592767 nova_compute[182623]: 2026-01-22 23:01:40.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:01:40 np0005592767 nova_compute[182623]: 2026-01-22 23:01:40.916 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:01:42 np0005592767 nova_compute[182623]: 2026-01-22 23:01:42.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:42 np0005592767 nova_compute[182623]: 2026-01-22 23:01:42.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:42 np0005592767 nova_compute[182623]: 2026-01-22 23:01:42.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:01:42 np0005592767 nova_compute[182623]: 2026-01-22 23:01:42.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:01:42 np0005592767 nova_compute[182623]: 2026-01-22 23:01:42.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:01:42 np0005592767 nova_compute[182623]: 2026-01-22 23:01:42.920 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.059 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.061 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5708MB free_disk=73.05118942260742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.061 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.061 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.208 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.209 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.224 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.242 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.243 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.267 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.299 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.320 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.334 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.335 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.336 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:01:43 np0005592767 nova_compute[182623]: 2026-01-22 23:01:43.786 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:46 np0005592767 podman[242667]: 2026-01-22 23:01:46.135064583 +0000 UTC m=+0.057766207 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:01:46 np0005592767 podman[242668]: 2026-01-22 23:01:46.141958368 +0000 UTC m=+0.060237476 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 18:01:46 np0005592767 nova_compute[182623]: 2026-01-22 23:01:46.336 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:46 np0005592767 nova_compute[182623]: 2026-01-22 23:01:46.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:47 np0005592767 nova_compute[182623]: 2026-01-22 23:01:47.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:48 np0005592767 nova_compute[182623]: 2026-01-22 23:01:48.031 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:48 np0005592767 nova_compute[182623]: 2026-01-22 23:01:48.788 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:51 np0005592767 nova_compute[182623]: 2026-01-22 23:01:51.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:51 np0005592767 nova_compute[182623]: 2026-01-22 23:01:51.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:01:52 np0005592767 podman[242711]: 2026-01-22 23:01:52.153381321 +0000 UTC m=+0.072581426 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:01:53 np0005592767 nova_compute[182623]: 2026-01-22 23:01:53.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:53 np0005592767 nova_compute[182623]: 2026-01-22 23:01:53.789 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:54.688 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 18:01:54 np0005592767 nova_compute[182623]: 2026-01-22 23:01:54.689 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:54 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:54.691 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 18:01:54 np0005592767 nova_compute[182623]: 2026-01-22 23:01:54.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:01:56 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:01:56.693 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 18:01:58 np0005592767 nova_compute[182623]: 2026-01-22 23:01:58.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:01:58 np0005592767 nova_compute[182623]: 2026-01-22 23:01:58.792 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:00 np0005592767 nova_compute[182623]: 2026-01-22 23:02:00.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:02 np0005592767 nova_compute[182623]: 2026-01-22 23:02:02.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:03 np0005592767 nova_compute[182623]: 2026-01-22 23:02:03.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:03 np0005592767 nova_compute[182623]: 2026-01-22 23:02:03.794 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:05 np0005592767 podman[242735]: 2026-01-22 23:02:05.124494633 +0000 UTC m=+0.048457561 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:02:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:02:08 np0005592767 nova_compute[182623]: 2026-01-22 23:02:08.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:08 np0005592767 nova_compute[182623]: 2026-01-22 23:02:08.795 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:11 np0005592767 podman[242756]: 2026-01-22 23:02:11.201617599 +0000 UTC m=+0.103052346 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 22 18:02:11 np0005592767 podman[242755]: 2026-01-22 23:02:11.210278814 +0000 UTC m=+0.117601047 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:02:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:02:12.135 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:02:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:02:12.135 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:02:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:02:12.135 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:02:13 np0005592767 nova_compute[182623]: 2026-01-22 23:02:13.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:13 np0005592767 nova_compute[182623]: 2026-01-22 23:02:13.797 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:17 np0005592767 podman[242803]: 2026-01-22 23:02:17.169708649 +0000 UTC m=+0.083267056 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 18:02:17 np0005592767 podman[242804]: 2026-01-22 23:02:17.179864516 +0000 UTC m=+0.086966670 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 18:02:18 np0005592767 nova_compute[182623]: 2026-01-22 23:02:18.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:18 np0005592767 nova_compute[182623]: 2026-01-22 23:02:18.801 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:23 np0005592767 nova_compute[182623]: 2026-01-22 23:02:23.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:23 np0005592767 podman[242846]: 2026-01-22 23:02:23.200647106 +0000 UTC m=+0.113258674 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:02:23 np0005592767 nova_compute[182623]: 2026-01-22 23:02:23.803 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:28 np0005592767 nova_compute[182623]: 2026-01-22 23:02:28.054 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:28 np0005592767 nova_compute[182623]: 2026-01-22 23:02:28.806 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:33 np0005592767 nova_compute[182623]: 2026-01-22 23:02:33.058 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:33 np0005592767 nova_compute[182623]: 2026-01-22 23:02:33.843 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:36 np0005592767 podman[242872]: 2026-01-22 23:02:36.150323083 +0000 UTC m=+0.066532113 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 18:02:38 np0005592767 nova_compute[182623]: 2026-01-22 23:02:38.062 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:38 np0005592767 nova_compute[182623]: 2026-01-22 23:02:38.844 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:40 np0005592767 nova_compute[182623]: 2026-01-22 23:02:40.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:40 np0005592767 nova_compute[182623]: 2026-01-22 23:02:40.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:02:40 np0005592767 nova_compute[182623]: 2026-01-22 23:02:40.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:02:40 np0005592767 nova_compute[182623]: 2026-01-22 23:02:40.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:02:42 np0005592767 podman[242894]: 2026-01-22 23:02:42.150293804 +0000 UTC m=+0.062280922 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, release=1755695350, 
io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 22 18:02:42 np0005592767 podman[242893]: 2026-01-22 23:02:42.166307287 +0000 UTC m=+0.084576163 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:02:42 np0005592767 nova_compute[182623]: 2026-01-22 23:02:42.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:42 np0005592767 nova_compute[182623]: 2026-01-22 23:02:42.960 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:02:42 np0005592767 nova_compute[182623]: 2026-01-22 23:02:42.960 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:02:42 np0005592767 nova_compute[182623]: 2026-01-22 23:02:42.960 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:02:42 np0005592767 nova_compute[182623]: 2026-01-22 23:02:42.961 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.083 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.132 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.132 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5717MB free_disk=73.05154800415039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.133 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.133 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.194 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.195 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.216 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.228 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.229 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.229 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:02:43 np0005592767 nova_compute[182623]: 2026-01-22 23:02:43.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:44 np0005592767 nova_compute[182623]: 2026-01-22 23:02:44.230 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:46 np0005592767 nova_compute[182623]: 2026-01-22 23:02:46.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:48 np0005592767 nova_compute[182623]: 2026-01-22 23:02:48.086 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:48 np0005592767 podman[242941]: 2026-01-22 23:02:48.146000805 +0000 UTC m=+0.056103378 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:02:48 np0005592767 podman[242942]: 2026-01-22 23:02:48.153769804 +0000 UTC m=+0.068771236 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:02:48 np0005592767 nova_compute[182623]: 2026-01-22 23:02:48.850 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:48 np0005592767 nova_compute[182623]: 2026-01-22 23:02:48.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:48 np0005592767 nova_compute[182623]: 2026-01-22 23:02:48.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:52 np0005592767 nova_compute[182623]: 2026-01-22 23:02:52.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:52 np0005592767 nova_compute[182623]: 2026-01-22 23:02:52.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:02:53 np0005592767 nova_compute[182623]: 2026-01-22 23:02:53.088 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:53 np0005592767 nova_compute[182623]: 2026-01-22 23:02:53.853 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:54 np0005592767 podman[242985]: 2026-01-22 23:02:54.137727085 +0000 UTC m=+0.062835609 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:02:54 np0005592767 nova_compute[182623]: 2026-01-22 23:02:54.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:02:58 np0005592767 nova_compute[182623]: 2026-01-22 23:02:58.090 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:02:58 np0005592767 nova_compute[182623]: 2026-01-22 23:02:58.854 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:02 np0005592767 nova_compute[182623]: 2026-01-22 23:03:02.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:03 np0005592767 nova_compute[182623]: 2026-01-22 23:03:03.139 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:03 np0005592767 nova_compute[182623]: 2026-01-22 23:03:03.855 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:07 np0005592767 podman[243009]: 2026-01-22 23:03:07.131063965 +0000 UTC m=+0.056367355 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 18:03:08 np0005592767 nova_compute[182623]: 2026-01-22 23:03:08.142 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:08 np0005592767 nova_compute[182623]: 2026-01-22 23:03:08.858 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:03:12.136 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:03:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:03:12.136 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:03:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:03:12.136 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:03:13 np0005592767 podman[243032]: 2026-01-22 23:03:13.14602211 +0000 UTC m=+0.066251804 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 18:03:13 np0005592767 nova_compute[182623]: 2026-01-22 23:03:13.146 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:13 np0005592767 podman[243031]: 2026-01-22 23:03:13.174003322 +0000 UTC m=+0.090392998 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 18:03:13 np0005592767 systemd-logind[802]: New session 71 of user zuul.
Jan 22 18:03:13 np0005592767 systemd[1]: Started Session 71 of User zuul.
Jan 22 18:03:13 np0005592767 nova_compute[182623]: 2026-01-22 23:03:13.859 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:18 np0005592767 nova_compute[182623]: 2026-01-22 23:03:18.156 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:18 np0005592767 ovs-vsctl[243252]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 18:03:18 np0005592767 nova_compute[182623]: 2026-01-22 23:03:18.862 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:18 np0005592767 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 243106 (sos)
Jan 22 18:03:18 np0005592767 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 22 18:03:18 np0005592767 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 22 18:03:19 np0005592767 podman[243302]: 2026-01-22 23:03:19.037125505 +0000 UTC m=+0.057677932 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:03:19 np0005592767 podman[243300]: 2026-01-22 23:03:19.065879918 +0000 UTC m=+0.087245988 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 18:03:19 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 18:03:19 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 18:03:19 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 18:03:22 np0005592767 systemd[1]: Starting Hostname Service...
Jan 22 18:03:22 np0005592767 systemd[1]: Started Hostname Service.
Jan 22 18:03:23 np0005592767 nova_compute[182623]: 2026-01-22 23:03:23.158 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:23 np0005592767 nova_compute[182623]: 2026-01-22 23:03:23.863 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:24 np0005592767 podman[244039]: 2026-01-22 23:03:24.798060507 +0000 UTC m=+0.064062343 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:03:28 np0005592767 nova_compute[182623]: 2026-01-22 23:03:28.185 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:28 np0005592767 nova_compute[182623]: 2026-01-22 23:03:28.865 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:30 np0005592767 ovs-appctl[245150]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 18:03:30 np0005592767 ovs-appctl[245158]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 18:03:30 np0005592767 ovs-appctl[245163]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 18:03:33 np0005592767 nova_compute[182623]: 2026-01-22 23:03:33.204 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:33 np0005592767 nova_compute[182623]: 2026-01-22 23:03:33.866 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:36 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 18:03:37 np0005592767 podman[246372]: 2026-01-22 23:03:37.236560917 +0000 UTC m=+0.061397218 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 18:03:38 np0005592767 nova_compute[182623]: 2026-01-22 23:03:38.206 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:38 np0005592767 nova_compute[182623]: 2026-01-22 23:03:38.869 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:39 np0005592767 systemd[1]: Starting Time & Date Service...
Jan 22 18:03:39 np0005592767 systemd[1]: Started Time & Date Service.
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.956 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.956 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.956 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:03:42 np0005592767 nova_compute[182623]: 2026-01-22 23:03:42.957 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.088 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.089 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5497MB free_disk=72.48704147338867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.090 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.090 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.160 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.161 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.182 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.197 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.249 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.249 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.249 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:43 np0005592767 nova_compute[182623]: 2026-01-22 23:03:43.871 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:44 np0005592767 podman[246552]: 2026-01-22 23:03:44.167858999 +0000 UTC m=+0.077854633 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git)
Jan 22 18:03:44 np0005592767 podman[246551]: 2026-01-22 23:03:44.198840526 +0000 UTC m=+0.108797518 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 22 18:03:44 np0005592767 nova_compute[182623]: 2026-01-22 23:03:44.233 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:47 np0005592767 nova_compute[182623]: 2026-01-22 23:03:47.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:48 np0005592767 nova_compute[182623]: 2026-01-22 23:03:48.251 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:48 np0005592767 nova_compute[182623]: 2026-01-22 23:03:48.874 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:49 np0005592767 podman[246600]: 2026-01-22 23:03:49.184376207 +0000 UTC m=+0.093729152 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 18:03:49 np0005592767 podman[246599]: 2026-01-22 23:03:49.198170727 +0000 UTC m=+0.110871396 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:03:50 np0005592767 nova_compute[182623]: 2026-01-22 23:03:50.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:50 np0005592767 nova_compute[182623]: 2026-01-22 23:03:50.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:53 np0005592767 nova_compute[182623]: 2026-01-22 23:03:53.254 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:53 np0005592767 nova_compute[182623]: 2026-01-22 23:03:53.877 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:53 np0005592767 nova_compute[182623]: 2026-01-22 23:03:53.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:53 np0005592767 nova_compute[182623]: 2026-01-22 23:03:53.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:03:54 np0005592767 nova_compute[182623]: 2026-01-22 23:03:54.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:03:55 np0005592767 podman[246639]: 2026-01-22 23:03:55.140005116 +0000 UTC m=+0.049709877 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:03:58 np0005592767 nova_compute[182623]: 2026-01-22 23:03:58.256 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:03:58 np0005592767 nova_compute[182623]: 2026-01-22 23:03:58.878 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:03 np0005592767 nova_compute[182623]: 2026-01-22 23:04:03.302 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:03 np0005592767 systemd[1]: session-71.scope: Deactivated successfully.
Jan 22 18:04:03 np0005592767 systemd[1]: session-71.scope: Consumed 1min 21.167s CPU time, 724.7M memory peak, read 261.1M from disk, written 20.9M to disk.
Jan 22 18:04:03 np0005592767 systemd-logind[802]: Session 71 logged out. Waiting for processes to exit.
Jan 22 18:04:03 np0005592767 systemd-logind[802]: Removed session 71.
Jan 22 18:04:03 np0005592767 nova_compute[182623]: 2026-01-22 23:04:03.880 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:03 np0005592767 systemd-logind[802]: New session 72 of user zuul.
Jan 22 18:04:03 np0005592767 systemd[1]: Started Session 72 of User zuul.
Jan 22 18:04:04 np0005592767 systemd[1]: session-72.scope: Deactivated successfully.
Jan 22 18:04:04 np0005592767 systemd-logind[802]: Session 72 logged out. Waiting for processes to exit.
Jan 22 18:04:04 np0005592767 systemd-logind[802]: Removed session 72.
Jan 22 18:04:04 np0005592767 systemd-logind[802]: New session 73 of user zuul.
Jan 22 18:04:04 np0005592767 systemd[1]: Started Session 73 of User zuul.
Jan 22 18:04:04 np0005592767 systemd[1]: session-73.scope: Deactivated successfully.
Jan 22 18:04:04 np0005592767 systemd-logind[802]: Session 73 logged out. Waiting for processes to exit.
Jan 22 18:04:04 np0005592767 systemd-logind[802]: Removed session 73.
Jan 22 18:04:04 np0005592767 nova_compute[182623]: 2026-01-22 23:04:04.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:06 np0005592767 nova_compute[182623]: 2026-01-22 23:04:06.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:04:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:04:08 np0005592767 podman[246721]: 2026-01-22 23:04:08.1850573 +0000 UTC m=+0.095982396 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 18:04:08 np0005592767 nova_compute[182623]: 2026-01-22 23:04:08.303 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:08 np0005592767 nova_compute[182623]: 2026-01-22 23:04:08.882 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:09 np0005592767 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 18:04:09 np0005592767 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 18:04:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:04:12.137 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:04:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:04:12.140 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:04:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:04:12.140 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:04:13 np0005592767 nova_compute[182623]: 2026-01-22 23:04:13.306 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:13 np0005592767 nova_compute[182623]: 2026-01-22 23:04:13.883 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:15 np0005592767 podman[246746]: 2026-01-22 23:04:15.161706546 +0000 UTC m=+0.076911637 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 18:04:15 np0005592767 podman[246745]: 2026-01-22 23:04:15.19475111 +0000 UTC m=+0.111468213 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:04:18 np0005592767 nova_compute[182623]: 2026-01-22 23:04:18.345 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:18 np0005592767 nova_compute[182623]: 2026-01-22 23:04:18.885 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:20 np0005592767 podman[246793]: 2026-01-22 23:04:20.133008365 +0000 UTC m=+0.055122870 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:04:20 np0005592767 podman[246794]: 2026-01-22 23:04:20.151007414 +0000 UTC m=+0.068761396 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:04:23 np0005592767 nova_compute[182623]: 2026-01-22 23:04:23.347 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:23 np0005592767 nova_compute[182623]: 2026-01-22 23:04:23.888 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:26 np0005592767 podman[246834]: 2026-01-22 23:04:26.161437282 +0000 UTC m=+0.064721072 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 18:04:28 np0005592767 nova_compute[182623]: 2026-01-22 23:04:28.349 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:28 np0005592767 nova_compute[182623]: 2026-01-22 23:04:28.890 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:33 np0005592767 nova_compute[182623]: 2026-01-22 23:04:33.352 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:33 np0005592767 nova_compute[182623]: 2026-01-22 23:04:33.892 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:38 np0005592767 nova_compute[182623]: 2026-01-22 23:04:38.355 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:38 np0005592767 nova_compute[182623]: 2026-01-22 23:04:38.894 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:39 np0005592767 podman[246858]: 2026-01-22 23:04:39.161923694 +0000 UTC m=+0.073288514 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 22 18:04:42 np0005592767 nova_compute[182623]: 2026-01-22 23:04:42.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:42 np0005592767 nova_compute[182623]: 2026-01-22 23:04:42.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:04:42 np0005592767 nova_compute[182623]: 2026-01-22 23:04:42.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:04:42 np0005592767 nova_compute[182623]: 2026-01-22 23:04:42.936 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:04:42 np0005592767 nova_compute[182623]: 2026-01-22 23:04:42.937 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.391 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.897 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.911 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.943 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.943 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.944 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:04:43 np0005592767 nova_compute[182623]: 2026-01-22 23:04:43.944 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.170 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.171 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5676MB free_disk=73.05129623413086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.171 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.171 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.247 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.248 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.303 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.320 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.356 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:04:44 np0005592767 nova_compute[182623]: 2026-01-22 23:04:44.356 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:04:45 np0005592767 nova_compute[182623]: 2026-01-22 23:04:45.342 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:46 np0005592767 podman[246878]: 2026-01-22 23:04:46.172773665 +0000 UTC m=+0.092298411 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 22 18:04:46 np0005592767 podman[246879]: 2026-01-22 23:04:46.189226291 +0000 UTC m=+0.094561916 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git)
Jan 22 18:04:47 np0005592767 nova_compute[182623]: 2026-01-22 23:04:47.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:48 np0005592767 nova_compute[182623]: 2026-01-22 23:04:48.423 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:48 np0005592767 nova_compute[182623]: 2026-01-22 23:04:48.899 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:51 np0005592767 podman[246926]: 2026-01-22 23:04:51.166610853 +0000 UTC m=+0.075353383 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 18:04:51 np0005592767 podman[246927]: 2026-01-22 23:04:51.166618753 +0000 UTC m=+0.070442213 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 18:04:51 np0005592767 nova_compute[182623]: 2026-01-22 23:04:51.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:51 np0005592767 nova_compute[182623]: 2026-01-22 23:04:51.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:53 np0005592767 nova_compute[182623]: 2026-01-22 23:04:53.426 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:53 np0005592767 nova_compute[182623]: 2026-01-22 23:04:53.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:53 np0005592767 nova_compute[182623]: 2026-01-22 23:04:53.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:04:53 np0005592767 nova_compute[182623]: 2026-01-22 23:04:53.956 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:55 np0005592767 nova_compute[182623]: 2026-01-22 23:04:55.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:04:57 np0005592767 podman[246970]: 2026-01-22 23:04:57.187466556 +0000 UTC m=+0.104650191 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:04:58 np0005592767 nova_compute[182623]: 2026-01-22 23:04:58.464 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:04:58 np0005592767 nova_compute[182623]: 2026-01-22 23:04:58.959 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:03 np0005592767 nova_compute[182623]: 2026-01-22 23:05:03.467 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:03 np0005592767 nova_compute[182623]: 2026-01-22 23:05:03.960 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:06 np0005592767 nova_compute[182623]: 2026-01-22 23:05:06.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:08 np0005592767 nova_compute[182623]: 2026-01-22 23:05:08.469 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:08 np0005592767 nova_compute[182623]: 2026-01-22 23:05:08.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:08 np0005592767 nova_compute[182623]: 2026-01-22 23:05:08.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 18:05:08 np0005592767 nova_compute[182623]: 2026-01-22 23:05:08.963 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:10 np0005592767 podman[246999]: 2026-01-22 23:05:10.151239409 +0000 UTC m=+0.065118272 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 18:05:10 np0005592767 nova_compute[182623]: 2026-01-22 23:05:10.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:10 np0005592767 nova_compute[182623]: 2026-01-22 23:05:10.915 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 18:05:10 np0005592767 nova_compute[182623]: 2026-01-22 23:05:10.944 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 18:05:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:05:12.139 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:05:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:05:12.139 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:05:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:05:12.139 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:05:13 np0005592767 nova_compute[182623]: 2026-01-22 23:05:13.471 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:13 np0005592767 nova_compute[182623]: 2026-01-22 23:05:13.966 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:17 np0005592767 podman[247020]: 2026-01-22 23:05:17.147047035 +0000 UTC m=+0.067498189 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64)
Jan 22 18:05:17 np0005592767 podman[247019]: 2026-01-22 23:05:17.214001439 +0000 UTC m=+0.128512396 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 18:05:18 np0005592767 nova_compute[182623]: 2026-01-22 23:05:18.515 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:18 np0005592767 nova_compute[182623]: 2026-01-22 23:05:18.967 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:22 np0005592767 podman[247068]: 2026-01-22 23:05:22.141050177 +0000 UTC m=+0.059797272 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 18:05:22 np0005592767 podman[247067]: 2026-01-22 23:05:22.161108624 +0000 UTC m=+0.075348352 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 18:05:23 np0005592767 nova_compute[182623]: 2026-01-22 23:05:23.518 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:23 np0005592767 nova_compute[182623]: 2026-01-22 23:05:23.968 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:28 np0005592767 podman[247111]: 2026-01-22 23:05:28.163421584 +0000 UTC m=+0.078153922 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:05:28 np0005592767 nova_compute[182623]: 2026-01-22 23:05:28.520 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:28 np0005592767 nova_compute[182623]: 2026-01-22 23:05:28.970 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:33 np0005592767 nova_compute[182623]: 2026-01-22 23:05:33.549 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:33 np0005592767 nova_compute[182623]: 2026-01-22 23:05:33.971 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:38 np0005592767 nova_compute[182623]: 2026-01-22 23:05:38.591 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:38 np0005592767 nova_compute[182623]: 2026-01-22 23:05:38.976 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:41 np0005592767 podman[247136]: 2026-01-22 23:05:41.169065583 +0000 UTC m=+0.075190467 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 18:05:42 np0005592767 nova_compute[182623]: 2026-01-22 23:05:42.926 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:42 np0005592767 nova_compute[182623]: 2026-01-22 23:05:42.926 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:05:42 np0005592767 nova_compute[182623]: 2026-01-22 23:05:42.927 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:05:42 np0005592767 nova_compute[182623]: 2026-01-22 23:05:42.945 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.629 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.929 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.930 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.930 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.930 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:05:43 np0005592767 nova_compute[182623]: 2026-01-22 23:05:43.977 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.073 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.074 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5695MB free_disk=73.05129623413086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.075 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.075 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.279 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.280 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.319 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.341 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.343 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:05:44 np0005592767 nova_compute[182623]: 2026-01-22 23:05:44.343 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:05:47 np0005592767 nova_compute[182623]: 2026-01-22 23:05:47.346 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:47 np0005592767 nova_compute[182623]: 2026-01-22 23:05:47.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:48 np0005592767 podman[247157]: 2026-01-22 23:05:48.16949779 +0000 UTC m=+0.077818292 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Jan 22 18:05:48 np0005592767 podman[247156]: 2026-01-22 23:05:48.199595611 +0000 UTC m=+0.113634804 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 22 18:05:48 np0005592767 nova_compute[182623]: 2026-01-22 23:05:48.632 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:48 np0005592767 nova_compute[182623]: 2026-01-22 23:05:48.979 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:52 np0005592767 nova_compute[182623]: 2026-01-22 23:05:52.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:53 np0005592767 podman[247203]: 2026-01-22 23:05:53.122887964 +0000 UTC m=+0.042952084 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 18:05:53 np0005592767 podman[247202]: 2026-01-22 23:05:53.12594309 +0000 UTC m=+0.049041966 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 18:05:53 np0005592767 nova_compute[182623]: 2026-01-22 23:05:53.634 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:53 np0005592767 nova_compute[182623]: 2026-01-22 23:05:53.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:53 np0005592767 nova_compute[182623]: 2026-01-22 23:05:53.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:53 np0005592767 nova_compute[182623]: 2026-01-22 23:05:53.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:05:53 np0005592767 nova_compute[182623]: 2026-01-22 23:05:53.982 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:57 np0005592767 nova_compute[182623]: 2026-01-22 23:05:57.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:05:58 np0005592767 nova_compute[182623]: 2026-01-22 23:05:58.637 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:59 np0005592767 nova_compute[182623]: 2026-01-22 23:05:59.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:05:59 np0005592767 podman[247242]: 2026-01-22 23:05:59.110140768 +0000 UTC m=+0.056591732 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:06:03 np0005592767 nova_compute[182623]: 2026-01-22 23:06:03.640 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:04 np0005592767 nova_compute[182623]: 2026-01-22 23:06:04.028 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:06 np0005592767 nova_compute[182623]: 2026-01-22 23:06:06.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:06:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:06:08 np0005592767 nova_compute[182623]: 2026-01-22 23:06:08.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:09 np0005592767 nova_compute[182623]: 2026-01-22 23:06:09.030 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:11 np0005592767 nova_compute[182623]: 2026-01-22 23:06:11.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:12 np0005592767 podman[247267]: 2026-01-22 23:06:12.131120761 +0000 UTC m=+0.056233032 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 18:06:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:06:12.140 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:06:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:06:12.141 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:06:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:06:12.141 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:06:13 np0005592767 nova_compute[182623]: 2026-01-22 23:06:13.687 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:14 np0005592767 nova_compute[182623]: 2026-01-22 23:06:14.031 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:18 np0005592767 nova_compute[182623]: 2026-01-22 23:06:18.732 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:19 np0005592767 nova_compute[182623]: 2026-01-22 23:06:19.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:19 np0005592767 podman[247288]: 2026-01-22 23:06:19.15390857 +0000 UTC m=+0.056754546 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc.)
Jan 22 18:06:19 np0005592767 podman[247287]: 2026-01-22 23:06:19.178131245 +0000 UTC m=+0.086001873 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:06:23 np0005592767 nova_compute[182623]: 2026-01-22 23:06:23.782 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:24 np0005592767 nova_compute[182623]: 2026-01-22 23:06:24.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:24 np0005592767 podman[247332]: 2026-01-22 23:06:24.121179146 +0000 UTC m=+0.043359817 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:06:24 np0005592767 podman[247331]: 2026-01-22 23:06:24.127114224 +0000 UTC m=+0.049753638 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:06:28 np0005592767 nova_compute[182623]: 2026-01-22 23:06:28.784 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:29 np0005592767 nova_compute[182623]: 2026-01-22 23:06:29.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:30 np0005592767 podman[247372]: 2026-01-22 23:06:30.151906989 +0000 UTC m=+0.066066419 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:06:33 np0005592767 nova_compute[182623]: 2026-01-22 23:06:33.829 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:34 np0005592767 nova_compute[182623]: 2026-01-22 23:06:34.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:38 np0005592767 nova_compute[182623]: 2026-01-22 23:06:38.831 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:39 np0005592767 nova_compute[182623]: 2026-01-22 23:06:39.037 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:42 np0005592767 nova_compute[182623]: 2026-01-22 23:06:42.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:42 np0005592767 nova_compute[182623]: 2026-01-22 23:06:42.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:06:42 np0005592767 nova_compute[182623]: 2026-01-22 23:06:42.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:06:42 np0005592767 nova_compute[182623]: 2026-01-22 23:06:42.913 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:06:43 np0005592767 podman[247396]: 2026-01-22 23:06:43.135548918 +0000 UTC m=+0.058712712 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:06:43 np0005592767 nova_compute[182623]: 2026-01-22 23:06:43.866 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:44 np0005592767 nova_compute[182623]: 2026-01-22 23:06:44.039 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:44 np0005592767 nova_compute[182623]: 2026-01-22 23:06:44.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:44 np0005592767 nova_compute[182623]: 2026-01-22 23:06:44.942 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:06:44 np0005592767 nova_compute[182623]: 2026-01-22 23:06:44.943 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:06:44 np0005592767 nova_compute[182623]: 2026-01-22 23:06:44.943 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:06:44 np0005592767 nova_compute[182623]: 2026-01-22 23:06:44.943 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.089 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.090 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5690MB free_disk=73.05134963989258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.090 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.091 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.221 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.221 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.236 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.359 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.360 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.381 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.408 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.464 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.481 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.483 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:06:45 np0005592767 nova_compute[182623]: 2026-01-22 23:06:45.484 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:06:47 np0005592767 nova_compute[182623]: 2026-01-22 23:06:47.484 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:48 np0005592767 nova_compute[182623]: 2026-01-22 23:06:48.868 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:48 np0005592767 nova_compute[182623]: 2026-01-22 23:06:48.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:49 np0005592767 nova_compute[182623]: 2026-01-22 23:06:49.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:50 np0005592767 podman[247417]: 2026-01-22 23:06:50.138176718 +0000 UTC m=+0.057679032 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Jan 22 18:06:50 np0005592767 podman[247416]: 2026-01-22 23:06:50.222761881 +0000 UTC m=+0.134299810 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 18:06:53 np0005592767 nova_compute[182623]: 2026-01-22 23:06:53.872 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:53 np0005592767 nova_compute[182623]: 2026-01-22 23:06:53.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:53 np0005592767 nova_compute[182623]: 2026-01-22 23:06:53.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:54 np0005592767 nova_compute[182623]: 2026-01-22 23:06:54.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:54 np0005592767 nova_compute[182623]: 2026-01-22 23:06:54.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:06:54 np0005592767 nova_compute[182623]: 2026-01-22 23:06:54.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:06:55 np0005592767 podman[247464]: 2026-01-22 23:06:55.114274833 +0000 UTC m=+0.038722836 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 18:06:55 np0005592767 podman[247465]: 2026-01-22 23:06:55.139917158 +0000 UTC m=+0.051463227 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:06:58 np0005592767 nova_compute[182623]: 2026-01-22 23:06:58.874 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:59 np0005592767 nova_compute[182623]: 2026-01-22 23:06:59.044 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:06:59 np0005592767 nova_compute[182623]: 2026-01-22 23:06:59.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:01 np0005592767 podman[247508]: 2026-01-22 23:07:01.169448817 +0000 UTC m=+0.075524867 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:07:03 np0005592767 nova_compute[182623]: 2026-01-22 23:07:03.877 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:04 np0005592767 nova_compute[182623]: 2026-01-22 23:07:04.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:08 np0005592767 nova_compute[182623]: 2026-01-22 23:07:08.879 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:08 np0005592767 nova_compute[182623]: 2026-01-22 23:07:08.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:09 np0005592767 nova_compute[182623]: 2026-01-22 23:07:09.049 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:07:12.141 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:07:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:07:12.141 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:07:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:07:12.142 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:07:13 np0005592767 nova_compute[182623]: 2026-01-22 23:07:13.921 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:14 np0005592767 nova_compute[182623]: 2026-01-22 23:07:14.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:14 np0005592767 podman[247533]: 2026-01-22 23:07:14.172266556 +0000 UTC m=+0.078934013 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:07:18 np0005592767 nova_compute[182623]: 2026-01-22 23:07:18.923 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:19 np0005592767 nova_compute[182623]: 2026-01-22 23:07:19.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:21 np0005592767 podman[247555]: 2026-01-22 23:07:21.153955773 +0000 UTC m=+0.071232175 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 18:07:21 np0005592767 podman[247554]: 2026-01-22 23:07:21.17575814 +0000 UTC m=+0.099619789 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 18:07:23 np0005592767 nova_compute[182623]: 2026-01-22 23:07:23.926 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:24 np0005592767 nova_compute[182623]: 2026-01-22 23:07:24.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:26 np0005592767 podman[247605]: 2026-01-22 23:07:26.131333374 +0000 UTC m=+0.053946926 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:07:26 np0005592767 podman[247604]: 2026-01-22 23:07:26.159782549 +0000 UTC m=+0.083261366 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 18:07:28 np0005592767 nova_compute[182623]: 2026-01-22 23:07:28.930 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:29 np0005592767 nova_compute[182623]: 2026-01-22 23:07:29.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:32 np0005592767 podman[247647]: 2026-01-22 23:07:32.131311677 +0000 UTC m=+0.048628527 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:07:33 np0005592767 nova_compute[182623]: 2026-01-22 23:07:33.932 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:34 np0005592767 nova_compute[182623]: 2026-01-22 23:07:34.059 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:38 np0005592767 nova_compute[182623]: 2026-01-22 23:07:38.934 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:39 np0005592767 nova_compute[182623]: 2026-01-22 23:07:39.062 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:41 np0005592767 nova_compute[182623]: 2026-01-22 23:07:41.381 182627 DEBUG oslo_concurrency.processutils [None req-99e82455-173d-451a-9c58-52a7cd7c6520 c792c8e8aa0d49e0a31a292ed9d309f4 6912a9182ac44bb486092f7ccd64d58c - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 22 18:07:41 np0005592767 nova_compute[182623]: 2026-01-22 23:07:41.406 182627 DEBUG oslo_concurrency.processutils [None req-99e82455-173d-451a-9c58-52a7cd7c6520 c792c8e8aa0d49e0a31a292ed9d309f4 6912a9182ac44bb486092f7ccd64d58c - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 22 18:07:43 np0005592767 nova_compute[182623]: 2026-01-22 23:07:43.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:43 np0005592767 nova_compute[182623]: 2026-01-22 23:07:43.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:07:43 np0005592767 nova_compute[182623]: 2026-01-22 23:07:43.900 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:07:43 np0005592767 nova_compute[182623]: 2026-01-22 23:07:43.935 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:07:43 np0005592767 nova_compute[182623]: 2026-01-22 23:07:43.937 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:44 np0005592767 nova_compute[182623]: 2026-01-22 23:07:44.065 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:45 np0005592767 podman[247673]: 2026-01-22 23:07:45.146036712 +0000 UTC m=+0.064097973 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 18:07:46 np0005592767 nova_compute[182623]: 2026-01-22 23:07:46.909 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:46 np0005592767 nova_compute[182623]: 2026-01-22 23:07:46.948 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:07:46 np0005592767 nova_compute[182623]: 2026-01-22 23:07:46.949 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:07:46 np0005592767 nova_compute[182623]: 2026-01-22 23:07:46.949 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:07:46 np0005592767 nova_compute[182623]: 2026-01-22 23:07:46.949 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.150 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.151 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5691MB free_disk=73.05134963989258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.151 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.151 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.646 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.646 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:07:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:07:47.845 104135 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f2:6e:0f', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ce:5d:96:80:79:b3'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 22 18:07:47 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:07:47.847 104135 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 22 18:07:47 np0005592767 nova_compute[182623]: 2026-01-22 23:07:47.887 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:48 np0005592767 nova_compute[182623]: 2026-01-22 23:07:48.134 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:07:48 np0005592767 nova_compute[182623]: 2026-01-22 23:07:48.167 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:07:48 np0005592767 nova_compute[182623]: 2026-01-22 23:07:48.169 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:07:48 np0005592767 nova_compute[182623]: 2026-01-22 23:07:48.169 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:07:48 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:07:48.852 104135 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e130c2ec-fef7-4ed2-892d-1e3d7eaab401, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 22 18:07:48 np0005592767 nova_compute[182623]: 2026-01-22 23:07:48.974 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:49 np0005592767 nova_compute[182623]: 2026-01-22 23:07:49.068 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:49 np0005592767 nova_compute[182623]: 2026-01-22 23:07:49.158 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:49 np0005592767 nova_compute[182623]: 2026-01-22 23:07:49.900 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:52 np0005592767 podman[247693]: 2026-01-22 23:07:52.149666951 +0000 UTC m=+0.072217283 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 18:07:52 np0005592767 podman[247694]: 2026-01-22 23:07:52.149681852 +0000 UTC m=+0.060589425 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, architecture=x86_64, distribution-scope=public, release=1755695350, config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Jan 22 18:07:53 np0005592767 nova_compute[182623]: 2026-01-22 23:07:53.998 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:54 np0005592767 nova_compute[182623]: 2026-01-22 23:07:54.070 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:54 np0005592767 nova_compute[182623]: 2026-01-22 23:07:54.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:55 np0005592767 nova_compute[182623]: 2026-01-22 23:07:55.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:55 np0005592767 nova_compute[182623]: 2026-01-22 23:07:55.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:07:55 np0005592767 nova_compute[182623]: 2026-01-22 23:07:55.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:07:57 np0005592767 podman[247739]: 2026-01-22 23:07:57.13827172 +0000 UTC m=+0.052613269 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:07:57 np0005592767 podman[247738]: 2026-01-22 23:07:57.147153161 +0000 UTC m=+0.063590129 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:07:59 np0005592767 nova_compute[182623]: 2026-01-22 23:07:59.000 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:59 np0005592767 nova_compute[182623]: 2026-01-22 23:07:59.071 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:07:59 np0005592767 nova_compute[182623]: 2026-01-22 23:07:59.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:03 np0005592767 podman[247778]: 2026-01-22 23:08:03.124576185 +0000 UTC m=+0.049069337 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:08:04 np0005592767 nova_compute[182623]: 2026-01-22 23:08:04.002 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:04 np0005592767 nova_compute[182623]: 2026-01-22 23:08:04.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:08:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:08:09 np0005592767 nova_compute[182623]: 2026-01-22 23:08:09.048 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:09 np0005592767 nova_compute[182623]: 2026-01-22 23:08:09.076 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:09 np0005592767 nova_compute[182623]: 2026-01-22 23:08:09.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:08:12.142 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:08:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:08:12.143 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:08:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:08:12.143 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:08:14 np0005592767 nova_compute[182623]: 2026-01-22 23:08:14.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:14 np0005592767 nova_compute[182623]: 2026-01-22 23:08:14.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:14 np0005592767 nova_compute[182623]: 2026-01-22 23:08:14.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:16 np0005592767 podman[247801]: 2026-01-22 23:08:16.131035526 +0000 UTC m=+0.056550341 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:08:19 np0005592767 nova_compute[182623]: 2026-01-22 23:08:19.053 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:19 np0005592767 nova_compute[182623]: 2026-01-22 23:08:19.081 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:23 np0005592767 podman[247820]: 2026-01-22 23:08:23.206476716 +0000 UTC m=+0.116438784 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 18:08:23 np0005592767 podman[247821]: 2026-01-22 23:08:23.212449105 +0000 UTC m=+0.114484179 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6)
Jan 22 18:08:24 np0005592767 nova_compute[182623]: 2026-01-22 23:08:24.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:24 np0005592767 nova_compute[182623]: 2026-01-22 23:08:24.084 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:28 np0005592767 podman[247866]: 2026-01-22 23:08:28.120974358 +0000 UTC m=+0.045220660 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 18:08:28 np0005592767 podman[247867]: 2026-01-22 23:08:28.146026476 +0000 UTC m=+0.062052546 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:08:29 np0005592767 nova_compute[182623]: 2026-01-22 23:08:29.058 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:29 np0005592767 nova_compute[182623]: 2026-01-22 23:08:29.085 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:34 np0005592767 nova_compute[182623]: 2026-01-22 23:08:34.087 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:08:34 np0005592767 nova_compute[182623]: 2026-01-22 23:08:34.088 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:08:34 np0005592767 nova_compute[182623]: 2026-01-22 23:08:34.089 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:08:34 np0005592767 nova_compute[182623]: 2026-01-22 23:08:34.089 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:08:34 np0005592767 nova_compute[182623]: 2026-01-22 23:08:34.112 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:34 np0005592767 nova_compute[182623]: 2026-01-22 23:08:34.114 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:08:34 np0005592767 podman[247910]: 2026-01-22 23:08:34.189030665 +0000 UTC m=+0.053160874 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:08:39 np0005592767 nova_compute[182623]: 2026-01-22 23:08:39.115 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:43 np0005592767 nova_compute[182623]: 2026-01-22 23:08:43.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:43 np0005592767 nova_compute[182623]: 2026-01-22 23:08:43.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:08:43 np0005592767 nova_compute[182623]: 2026-01-22 23:08:43.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:08:44 np0005592767 nova_compute[182623]: 2026-01-22 23:08:44.071 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:08:44 np0005592767 nova_compute[182623]: 2026-01-22 23:08:44.116 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:08:47 np0005592767 podman[247938]: 2026-01-22 23:08:47.1496628 +0000 UTC m=+0.071224245 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 18:08:48 np0005592767 nova_compute[182623]: 2026-01-22 23:08:48.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:48 np0005592767 nova_compute[182623]: 2026-01-22 23:08:48.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:49 np0005592767 nova_compute[182623]: 2026-01-22 23:08:49.118 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:08:49 np0005592767 nova_compute[182623]: 2026-01-22 23:08:49.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:49 np0005592767 nova_compute[182623]: 2026-01-22 23:08:49.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:08:49 np0005592767 nova_compute[182623]: 2026-01-22 23:08:49.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:08:49 np0005592767 nova_compute[182623]: 2026-01-22 23:08:49.119 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:08:49 np0005592767 nova_compute[182623]: 2026-01-22 23:08:49.120 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.038 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.038 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.039 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.039 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.181 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.182 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5704MB free_disk=73.05134582519531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.183 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:08:52 np0005592767 nova_compute[182623]: 2026-01-22 23:08:52.183 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:08:54 np0005592767 nova_compute[182623]: 2026-01-22 23:08:54.120 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:54 np0005592767 podman[247957]: 2026-01-22 23:08:54.152128416 +0000 UTC m=+0.072370838 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:08:54 np0005592767 podman[247958]: 2026-01-22 23:08:54.16465356 +0000 UTC m=+0.078337256 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 22 18:08:55 np0005592767 nova_compute[182623]: 2026-01-22 23:08:55.608 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:08:55 np0005592767 nova_compute[182623]: 2026-01-22 23:08:55.609 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:08:55 np0005592767 nova_compute[182623]: 2026-01-22 23:08:55.642 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:08:55 np0005592767 nova_compute[182623]: 2026-01-22 23:08:55.674 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:08:55 np0005592767 nova_compute[182623]: 2026-01-22 23:08:55.676 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:08:55 np0005592767 nova_compute[182623]: 2026-01-22 23:08:55.676 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:08:57 np0005592767 nova_compute[182623]: 2026-01-22 23:08:57.676 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:57 np0005592767 nova_compute[182623]: 2026-01-22 23:08:57.677 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:57 np0005592767 nova_compute[182623]: 2026-01-22 23:08:57.677 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:57 np0005592767 nova_compute[182623]: 2026-01-22 23:08:57.677 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:57 np0005592767 nova_compute[182623]: 2026-01-22 23:08:57.678 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:08:58 np0005592767 nova_compute[182623]: 2026-01-22 23:08:58.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:08:59 np0005592767 podman[248003]: 2026-01-22 23:08:59.009660168 +0000 UTC m=+0.073112359 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 18:08:59 np0005592767 podman[248004]: 2026-01-22 23:08:59.01857495 +0000 UTC m=+0.071121552 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:08:59 np0005592767 nova_compute[182623]: 2026-01-22 23:08:59.122 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:08:59 np0005592767 nova_compute[182623]: 2026-01-22 23:08:59.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:08:59 np0005592767 nova_compute[182623]: 2026-01-22 23:08:59.123 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:08:59 np0005592767 nova_compute[182623]: 2026-01-22 23:08:59.124 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:08:59 np0005592767 nova_compute[182623]: 2026-01-22 23:08:59.162 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:08:59 np0005592767 nova_compute[182623]: 2026-01-22 23:08:59.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:01 np0005592767 nova_compute[182623]: 2026-01-22 23:09:01.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:04 np0005592767 nova_compute[182623]: 2026-01-22 23:09:04.163 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:04 np0005592767 nova_compute[182623]: 2026-01-22 23:09:04.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:04 np0005592767 nova_compute[182623]: 2026-01-22 23:09:04.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:09:04 np0005592767 nova_compute[182623]: 2026-01-22 23:09:04.166 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:04 np0005592767 nova_compute[182623]: 2026-01-22 23:09:04.206 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:04 np0005592767 nova_compute[182623]: 2026-01-22 23:09:04.207 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:05 np0005592767 podman[248044]: 2026-01-22 23:09:05.149404224 +0000 UTC m=+0.055399708 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:09:09 np0005592767 nova_compute[182623]: 2026-01-22 23:09:09.207 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:09 np0005592767 nova_compute[182623]: 2026-01-22 23:09:09.209 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:09 np0005592767 nova_compute[182623]: 2026-01-22 23:09:09.209 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:09:09 np0005592767 nova_compute[182623]: 2026-01-22 23:09:09.209 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:09 np0005592767 nova_compute[182623]: 2026-01-22 23:09:09.275 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:09 np0005592767 nova_compute[182623]: 2026-01-22 23:09:09.276 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:10 np0005592767 nova_compute[182623]: 2026-01-22 23:09:10.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:09:12.144 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:09:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:09:12.144 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:09:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:09:12.144 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:09:14 np0005592767 nova_compute[182623]: 2026-01-22 23:09:14.276 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:18 np0005592767 podman[248069]: 2026-01-22 23:09:18.157341506 +0000 UTC m=+0.073885200 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:09:19 np0005592767 nova_compute[182623]: 2026-01-22 23:09:19.278 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:19 np0005592767 nova_compute[182623]: 2026-01-22 23:09:19.278 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:19 np0005592767 nova_compute[182623]: 2026-01-22 23:09:19.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:09:19 np0005592767 nova_compute[182623]: 2026-01-22 23:09:19.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:19 np0005592767 nova_compute[182623]: 2026-01-22 23:09:19.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:19 np0005592767 nova_compute[182623]: 2026-01-22 23:09:19.280 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:24 np0005592767 nova_compute[182623]: 2026-01-22 23:09:24.279 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:25 np0005592767 podman[248090]: 2026-01-22 23:09:25.147148124 +0000 UTC m=+0.057597750 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.expose-services=)
Jan 22 18:09:25 np0005592767 podman[248089]: 2026-01-22 23:09:25.173992283 +0000 UTC m=+0.080998912 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:09:29 np0005592767 podman[248132]: 2026-01-22 23:09:29.149391286 +0000 UTC m=+0.065039581 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:09:29 np0005592767 podman[248133]: 2026-01-22 23:09:29.177671946 +0000 UTC m=+0.076304189 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:09:29 np0005592767 nova_compute[182623]: 2026-01-22 23:09:29.282 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:29 np0005592767 nova_compute[182623]: 2026-01-22 23:09:29.284 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:29 np0005592767 nova_compute[182623]: 2026-01-22 23:09:29.284 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:09:29 np0005592767 nova_compute[182623]: 2026-01-22 23:09:29.284 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:29 np0005592767 nova_compute[182623]: 2026-01-22 23:09:29.316 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:29 np0005592767 nova_compute[182623]: 2026-01-22 23:09:29.317 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:34 np0005592767 nova_compute[182623]: 2026-01-22 23:09:34.318 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:35 np0005592767 podman[248178]: 2026-01-22 23:09:35.981654168 +0000 UTC m=+0.055816339 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:09:39 np0005592767 nova_compute[182623]: 2026-01-22 23:09:39.319 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:42 np0005592767 nova_compute[182623]: 2026-01-22 23:09:42.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:43 np0005592767 nova_compute[182623]: 2026-01-22 23:09:43.917 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:43 np0005592767 nova_compute[182623]: 2026-01-22 23:09:43.918 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:09:43 np0005592767 nova_compute[182623]: 2026-01-22 23:09:43.918 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:09:43 np0005592767 nova_compute[182623]: 2026-01-22 23:09:43.951 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:09:44 np0005592767 nova_compute[182623]: 2026-01-22 23:09:44.320 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:48 np0005592767 nova_compute[182623]: 2026-01-22 23:09:48.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:48 np0005592767 nova_compute[182623]: 2026-01-22 23:09:48.922 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:09:48 np0005592767 nova_compute[182623]: 2026-01-22 23:09:48.922 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:09:48 np0005592767 nova_compute[182623]: 2026-01-22 23:09:48.922 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:09:48 np0005592767 nova_compute[182623]: 2026-01-22 23:09:48.923 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.161 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:09:49 np0005592767 podman[248202]: 2026-01-22 23:09:49.16276761 +0000 UTC m=+0.073924762 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.164 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5690MB free_disk=73.05134582519531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.164 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.165 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.232 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.232 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.322 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.323 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.324 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.324 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.325 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.327 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.416 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.433 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.436 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:09:49 np0005592767 nova_compute[182623]: 2026-01-22 23:09:49.436 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:09:50 np0005592767 nova_compute[182623]: 2026-01-22 23:09:50.436 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:51 np0005592767 nova_compute[182623]: 2026-01-22 23:09:51.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:54 np0005592767 nova_compute[182623]: 2026-01-22 23:09:54.326 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:55 np0005592767 nova_compute[182623]: 2026-01-22 23:09:55.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:55 np0005592767 nova_compute[182623]: 2026-01-22 23:09:55.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:55 np0005592767 nova_compute[182623]: 2026-01-22 23:09:55.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:09:56 np0005592767 podman[248222]: 2026-01-22 23:09:56.144684173 +0000 UTC m=+0.052127575 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 18:09:56 np0005592767 podman[248221]: 2026-01-22 23:09:56.173967262 +0000 UTC m=+0.083419581 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 18:09:56 np0005592767 nova_compute[182623]: 2026-01-22 23:09:56.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:09:59 np0005592767 nova_compute[182623]: 2026-01-22 23:09:59.328 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:59 np0005592767 nova_compute[182623]: 2026-01-22 23:09:59.330 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:09:59 np0005592767 nova_compute[182623]: 2026-01-22 23:09:59.331 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:09:59 np0005592767 nova_compute[182623]: 2026-01-22 23:09:59.331 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:09:59 np0005592767 nova_compute[182623]: 2026-01-22 23:09:59.355 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:09:59 np0005592767 nova_compute[182623]: 2026-01-22 23:09:59.356 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:00 np0005592767 podman[248267]: 2026-01-22 23:10:00.147824212 +0000 UTC m=+0.060257475 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:10:00 np0005592767 podman[248266]: 2026-01-22 23:10:00.149008115 +0000 UTC m=+0.064310280 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:10:03 np0005592767 nova_compute[182623]: 2026-01-22 23:10:03.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:04 np0005592767 nova_compute[182623]: 2026-01-22 23:10:04.357 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:04 np0005592767 nova_compute[182623]: 2026-01-22 23:10:04.359 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:04 np0005592767 nova_compute[182623]: 2026-01-22 23:10:04.359 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:10:04 np0005592767 nova_compute[182623]: 2026-01-22 23:10:04.359 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:04 np0005592767 nova_compute[182623]: 2026-01-22 23:10:04.383 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:04 np0005592767 nova_compute[182623]: 2026-01-22 23:10:04.383 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:06 np0005592767 podman[248309]: 2026-01-22 23:10:06.142134516 +0000 UTC m=+0.054880233 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:10:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:10:09 np0005592767 nova_compute[182623]: 2026-01-22 23:10:09.384 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:09 np0005592767 nova_compute[182623]: 2026-01-22 23:10:09.385 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:09 np0005592767 nova_compute[182623]: 2026-01-22 23:10:09.385 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:10:09 np0005592767 nova_compute[182623]: 2026-01-22 23:10:09.385 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:09 np0005592767 nova_compute[182623]: 2026-01-22 23:10:09.386 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:09 np0005592767 nova_compute[182623]: 2026-01-22 23:10:09.388 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:10 np0005592767 nova_compute[182623]: 2026-01-22 23:10:10.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:10:12.146 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:10:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:10:12.147 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:10:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:10:12.147 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:10:14 np0005592767 nova_compute[182623]: 2026-01-22 23:10:14.385 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:14 np0005592767 nova_compute[182623]: 2026-01-22 23:10:14.387 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:17 np0005592767 nova_compute[182623]: 2026-01-22 23:10:17.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:17 np0005592767 nova_compute[182623]: 2026-01-22 23:10:17.908 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:17 np0005592767 nova_compute[182623]: 2026-01-22 23:10:17.908 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 18:10:17 np0005592767 nova_compute[182623]: 2026-01-22 23:10:17.922 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 18:10:19 np0005592767 nova_compute[182623]: 2026-01-22 23:10:19.388 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:19 np0005592767 nova_compute[182623]: 2026-01-22 23:10:19.389 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:19 np0005592767 nova_compute[182623]: 2026-01-22 23:10:19.390 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:10:19 np0005592767 nova_compute[182623]: 2026-01-22 23:10:19.390 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:19 np0005592767 nova_compute[182623]: 2026-01-22 23:10:19.424 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:19 np0005592767 nova_compute[182623]: 2026-01-22 23:10:19.425 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:20 np0005592767 podman[248334]: 2026-01-22 23:10:20.126152394 +0000 UTC m=+0.052104024 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:10:21 np0005592767 nova_compute[182623]: 2026-01-22 23:10:21.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:21 np0005592767 nova_compute[182623]: 2026-01-22 23:10:21.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 18:10:24 np0005592767 nova_compute[182623]: 2026-01-22 23:10:24.425 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:24 np0005592767 nova_compute[182623]: 2026-01-22 23:10:24.426 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:27 np0005592767 podman[248357]: 2026-01-22 23:10:27.165308376 +0000 UTC m=+0.074088246 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 18:10:27 np0005592767 podman[248356]: 2026-01-22 23:10:27.173617821 +0000 UTC m=+0.092829366 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 22 18:10:29 np0005592767 nova_compute[182623]: 2026-01-22 23:10:29.427 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:31 np0005592767 podman[248406]: 2026-01-22 23:10:31.13399593 +0000 UTC m=+0.056515189 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 18:10:31 np0005592767 podman[248407]: 2026-01-22 23:10:31.142983454 +0000 UTC m=+0.060572043 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:10:34 np0005592767 nova_compute[182623]: 2026-01-22 23:10:34.428 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:34 np0005592767 nova_compute[182623]: 2026-01-22 23:10:34.429 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:37 np0005592767 podman[248450]: 2026-01-22 23:10:37.122981623 +0000 UTC m=+0.047960567 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:10:39 np0005592767 nova_compute[182623]: 2026-01-22 23:10:39.430 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:44 np0005592767 nova_compute[182623]: 2026-01-22 23:10:44.431 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:48 np0005592767 nova_compute[182623]: 2026-01-22 23:10:48.347 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:48 np0005592767 nova_compute[182623]: 2026-01-22 23:10:48.347 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:10:48 np0005592767 nova_compute[182623]: 2026-01-22 23:10:48.348 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:10:48 np0005592767 nova_compute[182623]: 2026-01-22 23:10:48.368 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:10:49 np0005592767 nova_compute[182623]: 2026-01-22 23:10:49.433 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:49 np0005592767 nova_compute[182623]: 2026-01-22 23:10:49.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:49 np0005592767 nova_compute[182623]: 2026-01-22 23:10:49.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:10:49 np0005592767 nova_compute[182623]: 2026-01-22 23:10:49.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:10:49 np0005592767 nova_compute[182623]: 2026-01-22 23:10:49.926 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:10:49 np0005592767 nova_compute[182623]: 2026-01-22 23:10:49.926 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.183 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.184 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5698MB free_disk=73.05134582519531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.185 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.185 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.270 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.270 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.294 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.313 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.314 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:10:50 np0005592767 nova_compute[182623]: 2026-01-22 23:10:50.314 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:10:51 np0005592767 podman[198494]: time="2026-01-22T23:10:51Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 18:10:51 np0005592767 podman[198494]: @ - - [22/Jan/2026:23:10:51 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 21520 "" "Go-http-client/1.1"
Jan 22 18:10:51 np0005592767 podman[248476]: 2026-01-22 23:10:51.16935611 +0000 UTC m=+0.087911237 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 18:10:51 np0005592767 nova_compute[182623]: 2026-01-22 23:10:51.315 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:52 np0005592767 nova_compute[182623]: 2026-01-22 23:10:52.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:54 np0005592767 nova_compute[182623]: 2026-01-22 23:10:54.435 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:10:54 np0005592767 nova_compute[182623]: 2026-01-22 23:10:54.436 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:54 np0005592767 nova_compute[182623]: 2026-01-22 23:10:54.436 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:10:54 np0005592767 nova_compute[182623]: 2026-01-22 23:10:54.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:54 np0005592767 nova_compute[182623]: 2026-01-22 23:10:54.437 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:10:54 np0005592767 nova_compute[182623]: 2026-01-22 23:10:54.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:10:55 np0005592767 nova_compute[182623]: 2026-01-22 23:10:55.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:55 np0005592767 nova_compute[182623]: 2026-01-22 23:10:55.896 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:10:57 np0005592767 nova_compute[182623]: 2026-01-22 23:10:57.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:57 np0005592767 nova_compute[182623]: 2026-01-22 23:10:57.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:58 np0005592767 podman[248499]: 2026-01-22 23:10:58.143326437 +0000 UTC m=+0.057273320 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 22 18:10:58 np0005592767 podman[248498]: 2026-01-22 23:10:58.186537139 +0000 UTC m=+0.101879252 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0)
Jan 22 18:10:58 np0005592767 nova_compute[182623]: 2026-01-22 23:10:58.676 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:10:59 np0005592767 nova_compute[182623]: 2026-01-22 23:10:59.439 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:02 np0005592767 podman[248546]: 2026-01-22 23:11:02.163003692 +0000 UTC m=+0.067822218 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:11:02 np0005592767 podman[248545]: 2026-01-22 23:11:02.171098751 +0000 UTC m=+0.082463802 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:11:04 np0005592767 nova_compute[182623]: 2026-01-22 23:11:04.443 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:11:04 np0005592767 nova_compute[182623]: 2026-01-22 23:11:04.895 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:06 np0005592767 nova_compute[182623]: 2026-01-22 23:11:06.645 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:08 np0005592767 podman[248588]: 2026-01-22 23:11:08.14248343 +0000 UTC m=+0.057062704 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 18:11:09 np0005592767 nova_compute[182623]: 2026-01-22 23:11:09.446 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:09 np0005592767 nova_compute[182623]: 2026-01-22 23:11:09.449 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:11:12.148 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:11:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:11:12.149 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:11:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:11:12.149 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:11:12 np0005592767 nova_compute[182623]: 2026-01-22 23:11:12.922 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:14 np0005592767 nova_compute[182623]: 2026-01-22 23:11:14.448 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:14 np0005592767 nova_compute[182623]: 2026-01-22 23:11:14.450 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:19 np0005592767 nova_compute[182623]: 2026-01-22 23:11:19.451 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:19 np0005592767 nova_compute[182623]: 2026-01-22 23:11:19.453 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:22 np0005592767 podman[248612]: 2026-01-22 23:11:22.137803624 +0000 UTC m=+0.063237079 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:11:24 np0005592767 nova_compute[182623]: 2026-01-22 23:11:24.453 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:24 np0005592767 nova_compute[182623]: 2026-01-22 23:11:24.454 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:29 np0005592767 podman[248633]: 2026-01-22 23:11:29.194667114 +0000 UTC m=+0.094003329 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7)
Jan 22 18:11:29 np0005592767 podman[248632]: 2026-01-22 23:11:29.218897789 +0000 UTC m=+0.124260024 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:11:29 np0005592767 nova_compute[182623]: 2026-01-22 23:11:29.456 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:11:29 np0005592767 nova_compute[182623]: 2026-01-22 23:11:29.458 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:11:29 np0005592767 nova_compute[182623]: 2026-01-22 23:11:29.459 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:11:29 np0005592767 nova_compute[182623]: 2026-01-22 23:11:29.459 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:11:29 np0005592767 nova_compute[182623]: 2026-01-22 23:11:29.484 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:29 np0005592767 nova_compute[182623]: 2026-01-22 23:11:29.485 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:11:33 np0005592767 podman[248680]: 2026-01-22 23:11:33.159370424 +0000 UTC m=+0.068463947 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:11:33 np0005592767 podman[248679]: 2026-01-22 23:11:33.162793661 +0000 UTC m=+0.068137218 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:11:34 np0005592767 nova_compute[182623]: 2026-01-22 23:11:34.485 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:34 np0005592767 nova_compute[182623]: 2026-01-22 23:11:34.486 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:39 np0005592767 podman[248722]: 2026-01-22 23:11:39.160903776 +0000 UTC m=+0.075386883 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 18:11:39 np0005592767 nova_compute[182623]: 2026-01-22 23:11:39.487 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:11:39 np0005592767 nova_compute[182623]: 2026-01-22 23:11:39.489 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:39 np0005592767 nova_compute[182623]: 2026-01-22 23:11:39.489 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:11:39 np0005592767 nova_compute[182623]: 2026-01-22 23:11:39.489 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:11:39 np0005592767 nova_compute[182623]: 2026-01-22 23:11:39.490 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:11:39 np0005592767 nova_compute[182623]: 2026-01-22 23:11:39.492 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:44 np0005592767 nova_compute[182623]: 2026-01-22 23:11:44.490 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:44 np0005592767 nova_compute[182623]: 2026-01-22 23:11:44.492 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:45 np0005592767 nova_compute[182623]: 2026-01-22 23:11:45.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:45 np0005592767 nova_compute[182623]: 2026-01-22 23:11:45.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:11:45 np0005592767 nova_compute[182623]: 2026-01-22 23:11:45.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:11:45 np0005592767 nova_compute[182623]: 2026-01-22 23:11:45.959 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:11:49 np0005592767 nova_compute[182623]: 2026-01-22 23:11:49.491 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:49 np0005592767 nova_compute[182623]: 2026-01-22 23:11:49.492 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:51 np0005592767 nova_compute[182623]: 2026-01-22 23:11:51.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:51 np0005592767 nova_compute[182623]: 2026-01-22 23:11:51.934 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:11:51 np0005592767 nova_compute[182623]: 2026-01-22 23:11:51.935 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:11:51 np0005592767 nova_compute[182623]: 2026-01-22 23:11:51.935 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:11:51 np0005592767 nova_compute[182623]: 2026-01-22 23:11:51.935 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.101 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.102 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5694MB free_disk=73.05134201049805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.102 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.102 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.163 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.163 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.178 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.203 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.203 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.215 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.233 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.258 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.275 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.276 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:11:52 np0005592767 nova_compute[182623]: 2026-01-22 23:11:52.276 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:11:53 np0005592767 podman[248748]: 2026-01-22 23:11:53.166659606 +0000 UTC m=+0.090438308 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 18:11:53 np0005592767 nova_compute[182623]: 2026-01-22 23:11:53.276 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:53 np0005592767 nova_compute[182623]: 2026-01-22 23:11:53.276 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:54 np0005592767 nova_compute[182623]: 2026-01-22 23:11:54.493 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:56 np0005592767 nova_compute[182623]: 2026-01-22 23:11:56.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:56 np0005592767 nova_compute[182623]: 2026-01-22 23:11:56.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:11:59 np0005592767 nova_compute[182623]: 2026-01-22 23:11:59.495 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:11:59 np0005592767 nova_compute[182623]: 2026-01-22 23:11:59.496 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:11:59 np0005592767 podman[248769]: 2026-01-22 23:11:59.598114583 +0000 UTC m=+0.067279593 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 18:11:59 np0005592767 podman[248768]: 2026-01-22 23:11:59.613091057 +0000 UTC m=+0.084808779 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:11:59 np0005592767 nova_compute[182623]: 2026-01-22 23:11:59.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:11:59 np0005592767 nova_compute[182623]: 2026-01-22 23:11:59.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:04 np0005592767 podman[248816]: 2026-01-22 23:12:04.163730045 +0000 UTC m=+0.077697438 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:12:04 np0005592767 podman[248817]: 2026-01-22 23:12:04.169518209 +0000 UTC m=+0.071684388 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 18:12:04 np0005592767 nova_compute[182623]: 2026-01-22 23:12:04.498 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:04 np0005592767 nova_compute[182623]: 2026-01-22 23:12:04.499 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:04 np0005592767 nova_compute[182623]: 2026-01-22 23:12:04.500 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:04 np0005592767 nova_compute[182623]: 2026-01-22 23:12:04.500 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:04 np0005592767 nova_compute[182623]: 2026-01-22 23:12:04.501 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:04 np0005592767 nova_compute[182623]: 2026-01-22 23:12:04.502 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:06 np0005592767 nova_compute[182623]: 2026-01-22 23:12:06.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:12:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:12:09 np0005592767 nova_compute[182623]: 2026-01-22 23:12:09.503 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:09 np0005592767 nova_compute[182623]: 2026-01-22 23:12:09.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:09 np0005592767 nova_compute[182623]: 2026-01-22 23:12:09.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:09 np0005592767 nova_compute[182623]: 2026-01-22 23:12:09.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:09 np0005592767 nova_compute[182623]: 2026-01-22 23:12:09.504 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:09 np0005592767 nova_compute[182623]: 2026-01-22 23:12:09.506 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:10 np0005592767 podman[248860]: 2026-01-22 23:12:10.137541312 +0000 UTC m=+0.050865650 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:12:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:12:12.151 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:12:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:12:12.151 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:12:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:12:12.151 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:12:12 np0005592767 nova_compute[182623]: 2026-01-22 23:12:12.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:14 np0005592767 nova_compute[182623]: 2026-01-22 23:12:14.508 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:14 np0005592767 nova_compute[182623]: 2026-01-22 23:12:14.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:14 np0005592767 nova_compute[182623]: 2026-01-22 23:12:14.510 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:14 np0005592767 nova_compute[182623]: 2026-01-22 23:12:14.511 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:14 np0005592767 nova_compute[182623]: 2026-01-22 23:12:14.526 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:14 np0005592767 nova_compute[182623]: 2026-01-22 23:12:14.526 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:18 np0005592767 nova_compute[182623]: 2026-01-22 23:12:18.893 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:19 np0005592767 nova_compute[182623]: 2026-01-22 23:12:19.527 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:19 np0005592767 nova_compute[182623]: 2026-01-22 23:12:19.529 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:19 np0005592767 nova_compute[182623]: 2026-01-22 23:12:19.529 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:19 np0005592767 nova_compute[182623]: 2026-01-22 23:12:19.529 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:19 np0005592767 nova_compute[182623]: 2026-01-22 23:12:19.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:19 np0005592767 nova_compute[182623]: 2026-01-22 23:12:19.557 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:24 np0005592767 podman[248889]: 2026-01-22 23:12:24.175732189 +0000 UTC m=+0.086823486 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 18:12:24 np0005592767 nova_compute[182623]: 2026-01-22 23:12:24.558 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:29 np0005592767 nova_compute[182623]: 2026-01-22 23:12:29.560 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:29 np0005592767 nova_compute[182623]: 2026-01-22 23:12:29.561 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:30 np0005592767 podman[248910]: 2026-01-22 23:12:30.135313135 +0000 UTC m=+0.057510387 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Jan 22 18:12:30 np0005592767 podman[248909]: 2026-01-22 23:12:30.170599352 +0000 UTC m=+0.093318349 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 18:12:34 np0005592767 nova_compute[182623]: 2026-01-22 23:12:34.562 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:34 np0005592767 nova_compute[182623]: 2026-01-22 23:12:34.564 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:34 np0005592767 nova_compute[182623]: 2026-01-22 23:12:34.564 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:34 np0005592767 nova_compute[182623]: 2026-01-22 23:12:34.564 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:34 np0005592767 nova_compute[182623]: 2026-01-22 23:12:34.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:34 np0005592767 nova_compute[182623]: 2026-01-22 23:12:34.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:35 np0005592767 podman[248956]: 2026-01-22 23:12:35.188410169 +0000 UTC m=+0.096182610 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 18:12:35 np0005592767 podman[248957]: 2026-01-22 23:12:35.19092763 +0000 UTC m=+0.092498896 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:12:39 np0005592767 nova_compute[182623]: 2026-01-22 23:12:39.565 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:39 np0005592767 nova_compute[182623]: 2026-01-22 23:12:39.566 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:39 np0005592767 nova_compute[182623]: 2026-01-22 23:12:39.566 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:39 np0005592767 nova_compute[182623]: 2026-01-22 23:12:39.566 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:39 np0005592767 nova_compute[182623]: 2026-01-22 23:12:39.567 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:39 np0005592767 nova_compute[182623]: 2026-01-22 23:12:39.568 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:41 np0005592767 podman[248999]: 2026-01-22 23:12:41.130143821 +0000 UTC m=+0.050880730 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:12:44 np0005592767 nova_compute[182623]: 2026-01-22 23:12:44.568 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:47 np0005592767 nova_compute[182623]: 2026-01-22 23:12:47.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:47 np0005592767 nova_compute[182623]: 2026-01-22 23:12:47.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:12:47 np0005592767 nova_compute[182623]: 2026-01-22 23:12:47.899 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:12:47 np0005592767 nova_compute[182623]: 2026-01-22 23:12:47.914 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:12:49 np0005592767 nova_compute[182623]: 2026-01-22 23:12:49.570 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:49 np0005592767 nova_compute[182623]: 2026-01-22 23:12:49.572 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:49 np0005592767 nova_compute[182623]: 2026-01-22 23:12:49.573 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:49 np0005592767 nova_compute[182623]: 2026-01-22 23:12:49.573 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:49 np0005592767 nova_compute[182623]: 2026-01-22 23:12:49.598 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:49 np0005592767 nova_compute[182623]: 2026-01-22 23:12:49.599 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:53 np0005592767 nova_compute[182623]: 2026-01-22 23:12:53.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:53 np0005592767 nova_compute[182623]: 2026-01-22 23:12:53.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:53 np0005592767 nova_compute[182623]: 2026-01-22 23:12:53.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:12:53 np0005592767 nova_compute[182623]: 2026-01-22 23:12:53.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:12:53 np0005592767 nova_compute[182623]: 2026-01-22 23:12:53.925 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:12:53 np0005592767 nova_compute[182623]: 2026-01-22 23:12:53.925 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.124 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.125 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5692MB free_disk=73.05134201049805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.126 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.126 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.321 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.322 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.401 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.415 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.417 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.418 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:12:54 np0005592767 nova_compute[182623]: 2026-01-22 23:12:54.599 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:55 np0005592767 podman[249024]: 2026-01-22 23:12:55.168079759 +0000 UTC m=+0.082100213 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 18:12:55 np0005592767 nova_compute[182623]: 2026-01-22 23:12:55.418 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:57 np0005592767 nova_compute[182623]: 2026-01-22 23:12:57.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:12:57 np0005592767 nova_compute[182623]: 2026-01-22 23:12:57.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:12:59 np0005592767 nova_compute[182623]: 2026-01-22 23:12:59.602 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:59 np0005592767 nova_compute[182623]: 2026-01-22 23:12:59.604 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:12:59 np0005592767 nova_compute[182623]: 2026-01-22 23:12:59.604 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:12:59 np0005592767 nova_compute[182623]: 2026-01-22 23:12:59.604 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:12:59 np0005592767 nova_compute[182623]: 2026-01-22 23:12:59.639 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:12:59 np0005592767 nova_compute[182623]: 2026-01-22 23:12:59.640 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:00 np0005592767 nova_compute[182623]: 2026-01-22 23:13:00.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:00 np0005592767 nova_compute[182623]: 2026-01-22 23:13:00.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:01 np0005592767 podman[249045]: 2026-01-22 23:13:01.180669913 +0000 UTC m=+0.089826671 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 22 18:13:01 np0005592767 podman[249044]: 2026-01-22 23:13:01.194451793 +0000 UTC m=+0.110324841 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 18:13:04 np0005592767 nova_compute[182623]: 2026-01-22 23:13:04.641 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:04 np0005592767 nova_compute[182623]: 2026-01-22 23:13:04.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:04 np0005592767 nova_compute[182623]: 2026-01-22 23:13:04.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:13:04 np0005592767 nova_compute[182623]: 2026-01-22 23:13:04.643 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:04 np0005592767 nova_compute[182623]: 2026-01-22 23:13:04.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:04 np0005592767 nova_compute[182623]: 2026-01-22 23:13:04.683 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:06 np0005592767 podman[249089]: 2026-01-22 23:13:06.209210853 +0000 UTC m=+0.116944308 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 18:13:06 np0005592767 podman[249090]: 2026-01-22 23:13:06.224266709 +0000 UTC m=+0.133188577 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 18:13:08 np0005592767 nova_compute[182623]: 2026-01-22 23:13:08.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:09 np0005592767 nova_compute[182623]: 2026-01-22 23:13:09.685 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:09 np0005592767 nova_compute[182623]: 2026-01-22 23:13:09.686 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:09 np0005592767 nova_compute[182623]: 2026-01-22 23:13:09.687 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:13:09 np0005592767 nova_compute[182623]: 2026-01-22 23:13:09.687 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:09 np0005592767 nova_compute[182623]: 2026-01-22 23:13:09.717 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:09 np0005592767 nova_compute[182623]: 2026-01-22 23:13:09.718 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:13:12.153 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:13:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:13:12.153 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:13:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:13:12.153 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:13:12 np0005592767 podman[249134]: 2026-01-22 23:13:12.1649745 +0000 UTC m=+0.079552530 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:13:14 np0005592767 nova_compute[182623]: 2026-01-22 23:13:14.719 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:14 np0005592767 nova_compute[182623]: 2026-01-22 23:13:14.720 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:14 np0005592767 nova_compute[182623]: 2026-01-22 23:13:14.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:19 np0005592767 nova_compute[182623]: 2026-01-22 23:13:19.721 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:19 np0005592767 nova_compute[182623]: 2026-01-22 23:13:19.724 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:24 np0005592767 nova_compute[182623]: 2026-01-22 23:13:24.724 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:24 np0005592767 nova_compute[182623]: 2026-01-22 23:13:24.726 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:24 np0005592767 nova_compute[182623]: 2026-01-22 23:13:24.727 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:13:24 np0005592767 nova_compute[182623]: 2026-01-22 23:13:24.727 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:24 np0005592767 nova_compute[182623]: 2026-01-22 23:13:24.728 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:24 np0005592767 nova_compute[182623]: 2026-01-22 23:13:24.730 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:26 np0005592767 podman[249159]: 2026-01-22 23:13:26.145008702 +0000 UTC m=+0.067941672 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 18:13:29 np0005592767 nova_compute[182623]: 2026-01-22 23:13:29.731 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:29 np0005592767 nova_compute[182623]: 2026-01-22 23:13:29.733 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:29 np0005592767 nova_compute[182623]: 2026-01-22 23:13:29.733 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:13:29 np0005592767 nova_compute[182623]: 2026-01-22 23:13:29.734 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:29 np0005592767 nova_compute[182623]: 2026-01-22 23:13:29.759 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:29 np0005592767 nova_compute[182623]: 2026-01-22 23:13:29.759 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:32 np0005592767 podman[249180]: 2026-01-22 23:13:32.135039538 +0000 UTC m=+0.056096608 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 18:13:32 np0005592767 podman[249179]: 2026-01-22 23:13:32.158334176 +0000 UTC m=+0.082025420 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 18:13:34 np0005592767 nova_compute[182623]: 2026-01-22 23:13:34.760 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:34 np0005592767 nova_compute[182623]: 2026-01-22 23:13:34.761 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:37 np0005592767 podman[249230]: 2026-01-22 23:13:37.146057194 +0000 UTC m=+0.063873548 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 18:13:37 np0005592767 podman[249229]: 2026-01-22 23:13:37.163418134 +0000 UTC m=+0.087392162 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 18:13:39 np0005592767 nova_compute[182623]: 2026-01-22 23:13:39.762 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:43 np0005592767 podman[249271]: 2026-01-22 23:13:43.134214917 +0000 UTC m=+0.052362282 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:13:44 np0005592767 nova_compute[182623]: 2026-01-22 23:13:44.763 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:48 np0005592767 nova_compute[182623]: 2026-01-22 23:13:48.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:48 np0005592767 nova_compute[182623]: 2026-01-22 23:13:48.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:13:48 np0005592767 nova_compute[182623]: 2026-01-22 23:13:48.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:13:48 np0005592767 nova_compute[182623]: 2026-01-22 23:13:48.927 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:13:49 np0005592767 nova_compute[182623]: 2026-01-22 23:13:49.765 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:53 np0005592767 nova_compute[182623]: 2026-01-22 23:13:53.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:53 np0005592767 nova_compute[182623]: 2026-01-22 23:13:53.937 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:13:53 np0005592767 nova_compute[182623]: 2026-01-22 23:13:53.938 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:13:53 np0005592767 nova_compute[182623]: 2026-01-22 23:13:53.938 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:13:53 np0005592767 nova_compute[182623]: 2026-01-22 23:13:53.938 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.099 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.100 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5684MB free_disk=73.05134201049805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.101 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.101 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.156 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.156 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.180 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.195 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.196 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.196 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:13:54 np0005592767 nova_compute[182623]: 2026-01-22 23:13:54.767 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:56 np0005592767 nova_compute[182623]: 2026-01-22 23:13:56.196 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:56 np0005592767 nova_compute[182623]: 2026-01-22 23:13:56.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:57 np0005592767 podman[249295]: 2026-01-22 23:13:57.162214914 +0000 UTC m=+0.070717450 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:13:57 np0005592767 nova_compute[182623]: 2026-01-22 23:13:57.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:13:57 np0005592767 nova_compute[182623]: 2026-01-22 23:13:57.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:13:59 np0005592767 nova_compute[182623]: 2026-01-22 23:13:59.769 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:13:59 np0005592767 nova_compute[182623]: 2026-01-22 23:13:59.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:13:59 np0005592767 nova_compute[182623]: 2026-01-22 23:13:59.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:13:59 np0005592767 nova_compute[182623]: 2026-01-22 23:13:59.770 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:59 np0005592767 nova_compute[182623]: 2026-01-22 23:13:59.771 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:13:59 np0005592767 nova_compute[182623]: 2026-01-22 23:13:59.772 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:00 np0005592767 nova_compute[182623]: 2026-01-22 23:14:00.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:01 np0005592767 nova_compute[182623]: 2026-01-22 23:14:01.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:03 np0005592767 podman[249315]: 2026-01-22 23:14:03.154370092 +0000 UTC m=+0.075135505 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:14:03 np0005592767 podman[249316]: 2026-01-22 23:14:03.158321144 +0000 UTC m=+0.064781113 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350)
Jan 22 18:14:04 np0005592767 nova_compute[182623]: 2026-01-22 23:14:04.774 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.340 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.341 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:14:07.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:14:07 np0005592767 podman[249367]: 2026-01-22 23:14:07.606691069 +0000 UTC m=+0.066451628 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 18:14:07 np0005592767 podman[249366]: 2026-01-22 23:14:07.630633106 +0000 UTC m=+0.083738437 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 18:14:09 np0005592767 nova_compute[182623]: 2026-01-22 23:14:09.776 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:09 np0005592767 nova_compute[182623]: 2026-01-22 23:14:09.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:14:12.154 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:14:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:14:12.155 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:14:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:14:12.155 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:14:14 np0005592767 podman[249415]: 2026-01-22 23:14:14.153045045 +0000 UTC m=+0.067957172 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.781 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.783 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.784 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.784 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.817 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.817 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:14 np0005592767 nova_compute[182623]: 2026-01-22 23:14:14.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:18 np0005592767 nova_compute[182623]: 2026-01-22 23:14:18.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:19 np0005592767 nova_compute[182623]: 2026-01-22 23:14:19.818 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:24 np0005592767 nova_compute[182623]: 2026-01-22 23:14:24.821 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:28 np0005592767 podman[249440]: 2026-01-22 23:14:28.136843564 +0000 UTC m=+0.055442468 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:14:29 np0005592767 nova_compute[182623]: 2026-01-22 23:14:29.823 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:29 np0005592767 nova_compute[182623]: 2026-01-22 23:14:29.825 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:29 np0005592767 nova_compute[182623]: 2026-01-22 23:14:29.825 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:29 np0005592767 nova_compute[182623]: 2026-01-22 23:14:29.825 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:29 np0005592767 nova_compute[182623]: 2026-01-22 23:14:29.844 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:29 np0005592767 nova_compute[182623]: 2026-01-22 23:14:29.845 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:34 np0005592767 podman[249464]: 2026-01-22 23:14:34.160997444 +0000 UTC m=+0.071236595 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 18:14:34 np0005592767 podman[249463]: 2026-01-22 23:14:34.18772258 +0000 UTC m=+0.095380048 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 18:14:34 np0005592767 nova_compute[182623]: 2026-01-22 23:14:34.846 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:34 np0005592767 nova_compute[182623]: 2026-01-22 23:14:34.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:34 np0005592767 nova_compute[182623]: 2026-01-22 23:14:34.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:34 np0005592767 nova_compute[182623]: 2026-01-22 23:14:34.848 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:34 np0005592767 nova_compute[182623]: 2026-01-22 23:14:34.886 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:34 np0005592767 nova_compute[182623]: 2026-01-22 23:14:34.887 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:38 np0005592767 podman[249509]: 2026-01-22 23:14:38.140087711 +0000 UTC m=+0.060399948 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 18:14:38 np0005592767 podman[249510]: 2026-01-22 23:14:38.168079763 +0000 UTC m=+0.073838529 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 18:14:39 np0005592767 nova_compute[182623]: 2026-01-22 23:14:39.888 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:39 np0005592767 nova_compute[182623]: 2026-01-22 23:14:39.890 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:39 np0005592767 nova_compute[182623]: 2026-01-22 23:14:39.890 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:39 np0005592767 nova_compute[182623]: 2026-01-22 23:14:39.890 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:39 np0005592767 nova_compute[182623]: 2026-01-22 23:14:39.937 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:39 np0005592767 nova_compute[182623]: 2026-01-22 23:14:39.938 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:44 np0005592767 nova_compute[182623]: 2026-01-22 23:14:44.940 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:44 np0005592767 nova_compute[182623]: 2026-01-22 23:14:44.941 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:44 np0005592767 nova_compute[182623]: 2026-01-22 23:14:44.941 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:44 np0005592767 nova_compute[182623]: 2026-01-22 23:14:44.942 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:44 np0005592767 nova_compute[182623]: 2026-01-22 23:14:44.984 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:44 np0005592767 nova_compute[182623]: 2026-01-22 23:14:44.985 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:45 np0005592767 podman[249554]: 2026-01-22 23:14:45.16891463 +0000 UTC m=+0.077355118 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.911 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.912 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.986 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.988 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.988 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:49 np0005592767 nova_compute[182623]: 2026-01-22 23:14:49.988 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:50 np0005592767 nova_compute[182623]: 2026-01-22 23:14:50.023 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:50 np0005592767 nova_compute[182623]: 2026-01-22 23:14:50.024 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.025 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.027 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.028 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.034 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.035 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.914 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.951 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.951 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.952 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:14:55 np0005592767 nova_compute[182623]: 2026-01-22 23:14:55.952 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.192 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.194 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5689MB free_disk=73.05134201049805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.195 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.195 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.279 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.280 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.304 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.318 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.320 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:14:56 np0005592767 nova_compute[182623]: 2026-01-22 23:14:56.321 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:14:58 np0005592767 nova_compute[182623]: 2026-01-22 23:14:58.303 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:58 np0005592767 nova_compute[182623]: 2026-01-22 23:14:58.304 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:58 np0005592767 podman[249578]: 2026-01-22 23:14:58.878875574 +0000 UTC m=+0.045468126 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute)
Jan 22 18:14:59 np0005592767 nova_compute[182623]: 2026-01-22 23:14:59.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:14:59 np0005592767 nova_compute[182623]: 2026-01-22 23:14:59.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:15:00 np0005592767 nova_compute[182623]: 2026-01-22 23:15:00.036 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:01 np0005592767 nova_compute[182623]: 2026-01-22 23:15:01.898 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:02 np0005592767 nova_compute[182623]: 2026-01-22 23:15:02.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:05 np0005592767 nova_compute[182623]: 2026-01-22 23:15:05.038 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:05 np0005592767 podman[249600]: 2026-01-22 23:15:05.151923312 +0000 UTC m=+0.073408626 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Jan 22 18:15:05 np0005592767 podman[249599]: 2026-01-22 23:15:05.200070314 +0000 UTC m=+0.115921629 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 18:15:09 np0005592767 podman[249647]: 2026-01-22 23:15:09.131840474 +0000 UTC m=+0.055199262 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 22 18:15:09 np0005592767 podman[249648]: 2026-01-22 23:15:09.156257544 +0000 UTC m=+0.065297577 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.040 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.041 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:10 np0005592767 nova_compute[182623]: 2026-01-22 23:15:10.891 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:15:12.155 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:15:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:15:12.156 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:15:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:15:12.156 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:15:15 np0005592767 nova_compute[182623]: 2026-01-22 23:15:15.043 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:15 np0005592767 nova_compute[182623]: 2026-01-22 23:15:15.045 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:16 np0005592767 podman[249688]: 2026-01-22 23:15:16.154986602 +0000 UTC m=+0.070509365 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:15:16 np0005592767 nova_compute[182623]: 2026-01-22 23:15:16.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:20 np0005592767 nova_compute[182623]: 2026-01-22 23:15:20.046 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:20 np0005592767 nova_compute[182623]: 2026-01-22 23:15:20.047 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:25 np0005592767 nova_compute[182623]: 2026-01-22 23:15:25.049 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:25 np0005592767 nova_compute[182623]: 2026-01-22 23:15:25.050 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:25 np0005592767 nova_compute[182623]: 2026-01-22 23:15:25.050 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:15:25 np0005592767 nova_compute[182623]: 2026-01-22 23:15:25.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:25 np0005592767 nova_compute[182623]: 2026-01-22 23:15:25.051 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:25 np0005592767 nova_compute[182623]: 2026-01-22 23:15:25.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:26 np0005592767 nova_compute[182623]: 2026-01-22 23:15:26.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:26 np0005592767 nova_compute[182623]: 2026-01-22 23:15:26.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 22 18:15:27 np0005592767 nova_compute[182623]: 2026-01-22 23:15:27.057 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 22 18:15:29 np0005592767 podman[249713]: 2026-01-22 23:15:29.149286101 +0000 UTC m=+0.072529602 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 18:15:30 np0005592767 nova_compute[182623]: 2026-01-22 23:15:30.052 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:34 np0005592767 nova_compute[182623]: 2026-01-22 23:15:34.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:34 np0005592767 nova_compute[182623]: 2026-01-22 23:15:34.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 22 18:15:35 np0005592767 nova_compute[182623]: 2026-01-22 23:15:35.053 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:35 np0005592767 nova_compute[182623]: 2026-01-22 23:15:35.054 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:35 np0005592767 nova_compute[182623]: 2026-01-22 23:15:35.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:15:35 np0005592767 nova_compute[182623]: 2026-01-22 23:15:35.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:35 np0005592767 nova_compute[182623]: 2026-01-22 23:15:35.055 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:35 np0005592767 nova_compute[182623]: 2026-01-22 23:15:35.056 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:36 np0005592767 podman[249735]: 2026-01-22 23:15:36.202295086 +0000 UTC m=+0.114833282 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Jan 22 18:15:36 np0005592767 podman[249734]: 2026-01-22 23:15:36.21765495 +0000 UTC m=+0.142610387 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:15:40 np0005592767 nova_compute[182623]: 2026-01-22 23:15:40.057 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:40 np0005592767 podman[249779]: 2026-01-22 23:15:40.131986503 +0000 UTC m=+0.049040018 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 18:15:40 np0005592767 podman[249780]: 2026-01-22 23:15:40.135691228 +0000 UTC m=+0.049970084 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 18:15:45 np0005592767 nova_compute[182623]: 2026-01-22 23:15:45.058 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:47 np0005592767 podman[249820]: 2026-01-22 23:15:47.183543923 +0000 UTC m=+0.097206199 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 18:15:50 np0005592767 nova_compute[182623]: 2026-01-22 23:15:50.060 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:50 np0005592767 nova_compute[182623]: 2026-01-22 23:15:50.062 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:51 np0005592767 nova_compute[182623]: 2026-01-22 23:15:51.910 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:51 np0005592767 nova_compute[182623]: 2026-01-22 23:15:51.910 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:15:51 np0005592767 nova_compute[182623]: 2026-01-22 23:15:51.910 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:15:53 np0005592767 nova_compute[182623]: 2026-01-22 23:15:53.494 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:15:55 np0005592767 nova_compute[182623]: 2026-01-22 23:15:55.061 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:55 np0005592767 nova_compute[182623]: 2026-01-22 23:15:55.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:15:55 np0005592767 nova_compute[182623]: 2026-01-22 23:15:55.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:15:55 np0005592767 nova_compute[182623]: 2026-01-22 23:15:55.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:55 np0005592767 nova_compute[182623]: 2026-01-22 23:15:55.063 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:15:55 np0005592767 nova_compute[182623]: 2026-01-22 23:15:55.064 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:15:57 np0005592767 nova_compute[182623]: 2026-01-22 23:15:57.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:57 np0005592767 nova_compute[182623]: 2026-01-22 23:15:57.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:15:57 np0005592767 nova_compute[182623]: 2026-01-22 23:15:57.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:15:57 np0005592767 nova_compute[182623]: 2026-01-22 23:15:57.919 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:15:57 np0005592767 nova_compute[182623]: 2026-01-22 23:15:57.920 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:15:57 np0005592767 nova_compute[182623]: 2026-01-22 23:15:57.920 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.092 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.093 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5695MB free_disk=73.03571701049805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.094 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.094 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.178 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.178 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.213 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.234 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.235 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:15:58 np0005592767 nova_compute[182623]: 2026-01-22 23:15:58.235 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.065 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.067 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.067 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.067 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.068 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:00 np0005592767 podman[249847]: 2026-01-22 23:16:00.165728621 +0000 UTC m=+0.081690981 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 18:16:00 np0005592767 nova_compute[182623]: 2026-01-22 23:16:00.235 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:01 np0005592767 nova_compute[182623]: 2026-01-22 23:16:01.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:01 np0005592767 nova_compute[182623]: 2026-01-22 23:16:01.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:16:02 np0005592767 nova_compute[182623]: 2026-01-22 23:16:02.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:03 np0005592767 nova_compute[182623]: 2026-01-22 23:16:03.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:05 np0005592767 nova_compute[182623]: 2026-01-22 23:16:05.069 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:07 np0005592767 podman[249870]: 2026-01-22 23:16:07.191310717 +0000 UTC m=+0.093832194 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Jan 22 18:16:07 np0005592767 podman[249869]: 2026-01-22 23:16:07.215010947 +0000 UTC m=+0.127172317 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.338 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:07 np0005592767 ceilometer_agent_compute[192304]: 2026-01-22 23:16:07.339 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.070 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.071 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.072 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.072 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.072 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.073 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:10 np0005592767 nova_compute[182623]: 2026-01-22 23:16:10.890 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:11 np0005592767 podman[249916]: 2026-01-22 23:16:11.16068285 +0000 UTC m=+0.069893287 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 18:16:11 np0005592767 podman[249917]: 2026-01-22 23:16:11.178931846 +0000 UTC m=+0.075062533 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 18:16:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:16:12.157 104135 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:16:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:16:12.157 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:16:12 np0005592767 ovn_metadata_agent[104130]: 2026-01-22 23:16:12.157 104135 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:16:15 np0005592767 nova_compute[182623]: 2026-01-22 23:16:15.074 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:17 np0005592767 nova_compute[182623]: 2026-01-22 23:16:17.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:18 np0005592767 podman[249959]: 2026-01-22 23:16:18.144980599 +0000 UTC m=+0.051726973 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:16:19 np0005592767 nova_compute[182623]: 2026-01-22 23:16:19.892 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:20 np0005592767 nova_compute[182623]: 2026-01-22 23:16:20.076 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:20 np0005592767 nova_compute[182623]: 2026-01-22 23:16:20.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:20 np0005592767 nova_compute[182623]: 2026-01-22 23:16:20.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:16:20 np0005592767 nova_compute[182623]: 2026-01-22 23:16:20.078 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:20 np0005592767 nova_compute[182623]: 2026-01-22 23:16:20.099 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:20 np0005592767 nova_compute[182623]: 2026-01-22 23:16:20.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:25 np0005592767 nova_compute[182623]: 2026-01-22 23:16:25.100 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:25 np0005592767 nova_compute[182623]: 2026-01-22 23:16:25.101 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:30 np0005592767 nova_compute[182623]: 2026-01-22 23:16:30.102 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:30 np0005592767 nova_compute[182623]: 2026-01-22 23:16:30.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:30 np0005592767 nova_compute[182623]: 2026-01-22 23:16:30.103 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:16:30 np0005592767 nova_compute[182623]: 2026-01-22 23:16:30.104 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:30 np0005592767 nova_compute[182623]: 2026-01-22 23:16:30.104 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:30 np0005592767 nova_compute[182623]: 2026-01-22 23:16:30.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:31 np0005592767 podman[249983]: 2026-01-22 23:16:31.180086292 +0000 UTC m=+0.086606939 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 18:16:35 np0005592767 nova_compute[182623]: 2026-01-22 23:16:35.105 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:38 np0005592767 podman[250004]: 2026-01-22 23:16:38.18306832 +0000 UTC m=+0.083442960 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 18:16:38 np0005592767 podman[250003]: 2026-01-22 23:16:38.203171469 +0000 UTC m=+0.110317391 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 18:16:40 np0005592767 nova_compute[182623]: 2026-01-22 23:16:40.106 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:42 np0005592767 podman[250052]: 2026-01-22 23:16:42.172916643 +0000 UTC m=+0.073765336 container health_status e75b847ab1aac540718b4b19ae96f3c8740054af634a52b0904a462714e58b0a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 18:16:42 np0005592767 podman[250051]: 2026-01-22 23:16:42.205944677 +0000 UTC m=+0.111458822 container health_status 67e23eaaf26209d65dd5b496d0a191b5b0d7ca87e15ae7cb343ae44a8657716c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 18:16:45 np0005592767 nova_compute[182623]: 2026-01-22 23:16:45.109 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:45 np0005592767 nova_compute[182623]: 2026-01-22 23:16:45.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:45 np0005592767 nova_compute[182623]: 2026-01-22 23:16:45.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:16:45 np0005592767 nova_compute[182623]: 2026-01-22 23:16:45.111 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:45 np0005592767 nova_compute[182623]: 2026-01-22 23:16:45.149 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:45 np0005592767 nova_compute[182623]: 2026-01-22 23:16:45.149 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:49 np0005592767 podman[250094]: 2026-01-22 23:16:49.166432212 +0000 UTC m=+0.088768861 container health_status a96ae03165fafaa05fea219945509fb006d9a1f750f4e702e8b11c476b4e4778 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 18:16:50 np0005592767 nova_compute[182623]: 2026-01-22 23:16:50.151 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:51 np0005592767 nova_compute[182623]: 2026-01-22 23:16:51.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:51 np0005592767 nova_compute[182623]: 2026-01-22 23:16:51.897 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 22 18:16:51 np0005592767 nova_compute[182623]: 2026-01-22 23:16:51.898 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 22 18:16:51 np0005592767 nova_compute[182623]: 2026-01-22 23:16:51.918 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 22 18:16:55 np0005592767 nova_compute[182623]: 2026-01-22 23:16:55.153 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:55 np0005592767 nova_compute[182623]: 2026-01-22 23:16:55.155 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 22 18:16:55 np0005592767 nova_compute[182623]: 2026-01-22 23:16:55.155 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 22 18:16:55 np0005592767 nova_compute[182623]: 2026-01-22 23:16:55.155 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:55 np0005592767 nova_compute[182623]: 2026-01-22 23:16:55.182 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:16:55 np0005592767 nova_compute[182623]: 2026-01-22 23:16:55.183 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 22 18:16:58 np0005592767 systemd-logind[802]: New session 74 of user zuul.
Jan 22 18:16:58 np0005592767 systemd[1]: Started Session 74 of User zuul.
Jan 22 18:16:59 np0005592767 nova_compute[182623]: 2026-01-22 23:16:59.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:59 np0005592767 nova_compute[182623]: 2026-01-22 23:16:59.899 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:16:59 np0005592767 nova_compute[182623]: 2026-01-22 23:16:59.946 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:16:59 np0005592767 nova_compute[182623]: 2026-01-22 23:16:59.946 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:16:59 np0005592767 nova_compute[182623]: 2026-01-22 23:16:59.947 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:16:59 np0005592767 nova_compute[182623]: 2026-01-22 23:16:59.947 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.168 182627 WARNING nova.virt.libvirt.driver [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.170 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5658MB free_disk=73.03567886352539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.170 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.170 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.184 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.185 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.274 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.275 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.300 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing inventories for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.428 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating ProviderTree inventory for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.429 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Updating inventory in ProviderTree for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.444 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing aggregate associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.472 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Refreshing trait associations for resource provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec, traits: HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.491 182627 DEBUG nova.compute.provider_tree [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.502 182627 DEBUG nova.scheduler.client.report [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Inventory has not changed for provider 8bcb2fa3-fdcb-43fc-af5e-81e3d5de2cec based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.504 182627 DEBUG nova.compute.resource_tracker [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 22 18:17:00 np0005592767 nova_compute[182623]: 2026-01-22 23:17:00.504 182627 DEBUG oslo_concurrency.lockutils [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 22 18:17:02 np0005592767 podman[250269]: 2026-01-22 23:17:02.152986694 +0000 UTC m=+0.071804491 container health_status cefffc37a27cdbfdb4b3aa5fc7e17e855890f392377bc0ec5e59fae33d53c20c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 18:17:02 np0005592767 nova_compute[182623]: 2026-01-22 23:17:02.501 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:17:02 np0005592767 nova_compute[182623]: 2026-01-22 23:17:02.502 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:17:02 np0005592767 nova_compute[182623]: 2026-01-22 23:17:02.502 182627 DEBUG nova.compute.manager [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 22 18:17:03 np0005592767 ovs-vsctl[250316]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 18:17:03 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 18:17:04 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 18:17:04 np0005592767 virtqemud[182095]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 18:17:04 np0005592767 nova_compute[182623]: 2026-01-22 23:17:04.897 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:17:05 np0005592767 nova_compute[182623]: 2026-01-22 23:17:05.186 182627 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 22 18:17:05 np0005592767 nova_compute[182623]: 2026-01-22 23:17:05.896 182627 DEBUG oslo_service.periodic_task [None req-7b3c1ae5-9f80-4db1-a1ea-555619bb71b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 22 18:17:07 np0005592767 systemd[1]: Starting Hostname Service...
Jan 22 18:17:07 np0005592767 systemd[1]: Started Hostname Service.
Jan 22 18:17:08 np0005592767 podman[250973]: 2026-01-22 23:17:08.877127828 +0000 UTC m=+0.077156373 container health_status e0abc22629c11534db6d0c0a99c9130228ce51e465e92e98afc91c0b572d1f74 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Jan 22 18:17:08 np0005592767 podman[250972]: 2026-01-22 23:17:08.909178294 +0000 UTC m=+0.109905719 container health_status ded3e6bf14f1d3afbcd6f9983306107aebd20d95b927511fa816e16835f75319 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd88697696f49756a6016ca9f39278e109a31ecbf10ee84e353fbf853fc4dc20c-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304-7697ff697b87a7a35440228ce4ebd0097a946374b738879b6d3f4802fe687304'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
